Friday, March 31, 2023

Forbidden Science

 What still belongs to Science and what does not? Who is to decide what is science and what is para-science or pseudo-science? Some kinds of research are welcomed at one university, but not at another. There are respected scientists, some of whom are Nobel Prize winners, who are ostracized, most of the time by gossip, by various covert activities of their colleagues, simply because they dare to ask questions and research phenomena that others consider as “unworthy”. I have already mentioned several such cases, one example being the treatment of the “strange interests” of Alfred Wallace by Encyclopedia Universalis.

Someone - we do not know who - decided that a major part of the research of a distinguished scientist should be suppressed, that the public should not be told about it, that it is better to tell a lie than to tell an inconvenient truth.

Science in Secret?

A friend of mine, a distinguished French scientist who is interested in many “esoteric” areas, tells me that one should keep these interests to oneself, otherwise one will be punished: the covert actions of others will destroy one's scientific career. And that is what he does – he will discuss certain things in private, but will never dare to say them in public. What kind of science forces scientists to work in secret, from fear? What kind of society gives birth to that kind of science?

William Crookes

While reading the remarkable autobiography of Alfred Wallace, Darwin’s colleague, the co-discoverer - if not the original discoverer - of the mechanisms of evolution, I found the following interesting paragraph:

During the years 1870-80 I had many opportunities of witnessing interesting phenomena in the houses of various friends, some of which I have not made public. Early in 1874 I was invited by John Morley, then editor of the Fortnightly Review, to write an article on "Spiritualism" for that periodical. Much public interest had been excited by the publication of the Report of the Committee of the Dialectical Society, and especially by Mr. Crookes's experiments with Mr. Home, and the refusal of the Royal Society to see these experiments repeated. (Italics, mine.)

Who is Mr. Crookes? And what were these experiments that the Royal Society did not even want to witness? Remember: curiosity is a condition “sine qua non” of a true scientist! The Royal Society was not curious? Why? Perhaps the experiments of Mr. Crookes were not worthy of the attention of the learned society, because they did not suggest anything new?

To know more about the issues at hand, the first thing to do is to check: who was Mr. Crookes? According to Encyclopedia Universalis, Sir William Crookes (1832-1919) was an English chemist who inherited quite a fortune, so that he could do his research to some extent independently. He discovered thallium and invented the radiometer; he also devised the evacuated discharge tube known today as the “Crookes tube”.

Crookes tube

From Encyclopedia Britannica we learn, additionally, that William Crookes was knighted in 1897. Searching the net we can find, in particular,  an impressive list of awards:

Past President, Chemical Society, Brit. Assoc., Inst. Elect. Eng., Soc. Chem. Industry; Hon. Member, Roy. Phil. Soc. Glasgow, Roy. Soc. NSW, Pharm. Soc., Chem. Metall. and Mining Soc. of South Africa, Amer. Chem. Soc., Amer. Philos. Soc., Roy. Soc. Sci. Upsala, Deutsch. Chem. Gesell. Berlin, Psychol. Soc. Paris, Antonio Alzate Sci. Soc. Mexico, Sci. Soc. Bucharest, Reg. Accad. Zelanti; Foreign Mem. Accad. Lincei, Rome; Corresp. Inst. de France (Acad. Sci.), Corresp. Mem. Bataafsch Genoots., Rotterdam, Soc. Encouragement pour l'Indust. Paris; For. Assoc. National Acad. Sciences, Washington; Foreign Mem., Royal Swedish Academy of Sciences. International Exhibition, 1862, medal; Acadèmie des Sciences, 1880, gold medal and prize of 3000 frs; Electrical Exhibitions, Paris, 1881, medal; Society of Arts, 1885, Fergusson Gold Medal; Exposition Universelle, Paris, 1889, medal; Society of Arts, 1899, Albert Gold Medal; Franklin Institute, Philadelphia, 1912, Elliott Cresson Gold Medal; Soc. Chem. Industry, 1912, gold medal. Royal medallist, Davy medallist, Copley medallist, and three times Bakerian Lecturer of the Royal Society.

I feel that I need to explain my reasons for including this long list of Crookes’ awards here. Do awards count? Do titles count? Facts are the only things that count - one may argue. Well, awards are also facts, and, most of the time, awards are given in recognition of someone’s skills and achievements.

Next post: You Shall Know Them by Their Fruits

P.S.1. 01-04-23 A friend, a physicist, sent me this morning a link to this video by Rupert Sheldrake:


Rupert Sheldrake - The Science Delusion BANNED TED TALK

And there, in particular:

00:09:53
But I want to spend a few moments on the constants of nature too. Because these are, again, assumed to be constant. Things like the gravitational constant or the speed of light are called the fundamental constants. Are they really constant? Well, when I got interested in this question, I tried to find out. They're given in physics handbooks. Handbooks of physics list the fundamental constants and tell you their values. But I wanted to see if they'd changed, so I got the old volumes of physical handbooks. I went to the patent office library here in London - they're the only place I could find that kept the old volumes. Normally people throw them away when the new volumes come out; they throw away the old ones. When I did this I found that the speed of light dropped between nineteen twenty-eight and nineteen forty-five by about twenty kilometers per second. It's a huge drop, because the values are given with tiny errors, to several decimal points. And yet, all over the world, it dropped, and they were all getting very similar values to each other with tiny errors. Then in nineteen forty-eight, it went up again. And then people started getting very similar values again. I was very intrigued by this and I couldn't make sense of it, so I went to see the head of metrology at the National Physical Laboratory in Teddington. Metrology is the science in which people measure constants. And I asked him about this, I said "what do you make of this drop in the speed of light between nineteen twenty-eight and nineteen forty-five?" And he said "oh dear", he said "you've uncovered the most embarrassing episode in the history of our science."

So I said "well, could the speed of light have actually dropped? And that would have amazing implications if so." He said "no, no, of course it couldn't have actually dropped. It's a constant!" "Oh, well then how do you explain the fact that everyone was finding it going much slower during that period? Is it because they were fudging their results to get what they thought other people should be getting and the whole thing was just produced in the minds of physicists?" "We don't like to use the word 'fudge'."
I said "Well, so what do you prefer?" He said "well, we prefer to call it 'intellectual phase-locking'." So I said "well if it was going on then, how can you be so sure it's not going on today? And the present values produced are by intellectual phase-locking?" And he said "oh we know that's not the case."
And I said "how do we know?" He said "well", he said "we've solved the problem." And I said "well how?"
And he said "well we fixed the speed of light by definition in nineteen seventy-two."
So I said "but it might still change." He said "yes, but we'd never know it, because we've defined the metre in terms of the speed of light, so the units would change with it!"
So he looked very pleased about that, they'd fixed that problem.
But I said "well, then what about big G?" The gravitational constant, known in the trade as "big G", it was written with a capital G. Newton's universal gravitational constant.
"That's varied by more than 1.3% in recent years. And it seems to vary from place to place and from time to time." And he said "oh well, those are just errors. And unfortunately there are quite big errors with big G."
So I said "well, what if it's really changing? I mean, perhaps it is really changing." And then I looked at how they do it, what happens is they measure it in different labs, they get different values on different days, and then they average them. And then other labs around the world do the same, they come out usually with a rather different average. And then the international committee of metrology meets every ten years or so and average the ones from labs all around the world to come up with the value of big G. But what if G were actually fluctuating? What if it changed? There's already evidence actually that it changes throughout the day and throughout the year. What if the earth, as it moves through the galactic environment went through patches of dark matter or other environmental factors that could alter it? Maybe they all change together. What if these errors are going up together and down together? For more than ten years I've been trying to persuade metrologists to look at the raw data. In fact I'm now trying to persuade them to put it up online, on the internet. With the dates, and the actual measurements, and see if they're correlated. To see if they're all up at one time, all down at another. If so, they might be fluctuating together. And what would tell us something very, very interesting. But no-one has done this, they haven't done it because G is a constant. There's no point looking for changes. 

P.S.2.  April 1, 2023 17:22 Quantum Future is getting closer:




Thursday, March 30, 2023

Language Barriers Make Knowledge Barriers

 Of course for those who can’t read English there will be additional problems. Analysis shows, for instance, that:

Throughout the 20th century, international communication has shifted from a plural use of several languages to a clear pre-eminence of English, especially in the field of science. This paper focuses on international periodical publications where more than 75 percent of the articles in the social sciences and humanities and well over 90 percent in the natural sciences are written in English. The shift towards English implies that an increasing number of scientists whose mother tongue is not English have already moved to English for publication. Consequently, other international languages, namely French, German, Russian, Spanish and Japanese lose their attraction as languages of science. Many observers conclude that it has become inevitable to publish in English, even in English only.

Here is the graph taken from the globaldev blog publication "Removing language barriers for better science":

Figure 1: Shares of languages in science publications, 1880–2005: overall average percentage for biology, chemistry, medicine, physics, and mathematics. Sources: Tsunoda 1983; Ammon 1998; the author’s own analysis, with the help of Abdulkadir Topal and Vanessa G. Figure from: Ammon, U., 2010. p.115

In countries such as France - where great emphasis is placed on everyone speaking the same language, where the language is held up as proof of allegiance to “French values” (whether the population feels this consciously or not), and where languages are mostly taught by archaic methods that leave citizens unable to hold even simple conversations with English speakers after years of English lessons at school - the very small number of people who learn English causes a serious and growing isolation. This is dangerous to science in France and ultimately dangerous to France itself.

One of the main examples of this is the statistical evidence that France is at least 40 years behind other Western countries in the social sciences and humanities. In the top 100 world universities, France has four entries in the rankings: PSL University (Paris Sciences & Lettres) comes in at 26, Institut Polytechnique de Paris at 48, Sorbonne at 60, and Université Paris-Saclay at 69. That is a shocking fact for the country that is the home of “liberté, égalité, fraternité”. Of the top 10 universities in the world, five are in the U.S., four in the UK, and one in Switzerland (ETH). If France intends to catch up, it is going to have to speak English.


P.S.1 31-03-23 Reading now with excitement  R. Beneduci, F. Schroeck "Space localization of the photon" (2019):

Abstract

Starting from the phase space representation of quantum mechanics we provide an Euclidean system of covariance for the photon. In particular, we consider systems with the Poincaré group as the symmetry group and use a standard procedure in order to build a phase space and a localization observable on the phase space. Then we focus on the massless representations of the Poincaré group that we use to build a space localization observable for the photon.
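For readers unfamiliar with the term, a “system of covariance” is, in standard usage (this is my gloss, not a quotation from the paper), a normalized positive-operator-valued measure E on a phase space Γ that transforms consistently with a unitary representation U of the symmetry group G:

\[
U(g)\,E(B)\,U(g)^{\dagger} \;=\; E(g\cdot B), \qquad E(B)\ge 0, \qquad E(\Gamma)=\mathbb{1},
\]

for every group element g and every Borel set B ⊆ Γ. A localization observable for the photon is then such a POVM built on the phase space carried by the massless representations of the Poincaré group.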

Next post: Forbidden Science

Tuesday, March 28, 2023

Clifford’s Solution

William Kingdon Clifford, a great British mathematician and also a philosopher, wrote in his essay “The Ethics of Belief”:

To sum up: it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence.

Even if we accept the above, then we have the next pending question: when can – or should – the evidence be considered as sufficient? Suppose we do want to know the Truth, how do we get to it? Perhaps Truth is so far away that we will never get there? Perhaps it is impossible? We are human beings, with our limited senses, our limited capacities, limited time and limited resources.[1] Can we ever really get to the Truth? Isn’t it better, simpler, more efficient, to stay with just what “resonates with us” – as many New Agers declare – and be done with it?



Somehow it is almost automatic that when we meet two opinions on a given subject, one contradicting the other, then we tend to think that the truth is somewhere in the middle. But is that always the case? What if one person is lying? What if one of the persons has mental problems, or is being somehow rewarded for distorting the truth, while the other one is totally honest? In order to avoid making errors in our judgments we should always ask the question “Who says so?” and go to the very sources, check their reliability, and collect as much information as possible.

Nowadays, thanks to the internet, this is possible for even ordinary people as it never was before in history. At the same time, perhaps those who do not want the truth known are just as busy confusing the matter by publishing disinformation? Yet with patience and some experience, with a will, and sometimes helpers, there is a way. The key is in going to the sources and checking them carefully. Of course sometimes there will be a language barrier, but with automatic translators even this difficulty can be dealt with.

While we may not be able to get to the whole Truth, quite often, with a little effort we can do much better than we usually do. As I wrote above: First of all we should always try to check the sources rather than be satisfied with second hand information.

Consider the example of Encyclopedia Universalis and its erroneous image of Alfred Wallace’s scientific interests.[2] At the end of the article we find the source citation: “En 1905, il écrit son autobiographie, My Life.” [“In 1905, he wrote his autobiography, My Life.”] We then go to archive.org, the website of the Internet Archive, and search there for “Alfred Wallace My life”. The text is available on the internet and the search for “spiritual” brings us to the sentence: “In 1866 I wrote a pamphlet, entitled "The Scientific Aspect of the Supernatural," which I distributed amongst my friends.” In 1866 Wallace was 43 years old, and he lived to the age of 90. Evidently Encyclopedia Universalis is spreading false information when it states that it was only at the end of his life that Wallace became interested in “esoteric” phenomena. How many other instances are there like this? Millions? Billions? What does that mean for truth when the so-called arbiters of truth are shown to actually lie?

We are living in the age of Internet, with easy access to many sources. Always check sources, and when you see a piece written by someone, even by an “authority”, whether or not they quote or cite sources – consider it as just an opinion, not as a “proof”. If there are sources listed – check them, because it often happens that only selective information is being presented, not the whole picture.

[1] Just how limited our resources are, and how hard it really is to get to the truth, is exposed scientifically by Nobel Laureate Daniel Kahneman in his previously cited book, “Thinking, Fast and Slow”.

[2] http://www.universalis.fr/encyclopedie/alfred-russel-wallace/


Coming next: Language Barriers Make Knowledge Barriers

P.S.1. This morning I received an email from Dr. Gina Langan inviting me to "become a Guardian of Logos, protector of Absolute Truth."

I followed my own advice "when you see a piece written by someone, even by an “authority”, whether or not they quote or cite sources – consider it as just an opinion, not as a “proof”." 

What I have found is that Dr. Gina Langan won the 1989 Belgian Women's Championship and that she completed bachelor's, master's, and doctoral degrees in clinical psychology at Wayne State University in less than five years.

Not enough information for me to decide if I want to join the "Guardians of Logos and protectors of Absolute Truth". But I am seriously considering the offered seven-day free trial.

P.S.2. Finished reading “Atlas Shrugged” by Ayn Rand. What I liked is the “happy end”. What strikes me is the evident dislike of any kind of mysticism - indiscriminately. At the end, the group of co-linear “good”, “thinking” and “creative” people escapes the fate of the totally corrupt system. I wonder what Ayn Rand would say about today's political and economic reality, which would have to include China?

Chris Langan seems to be more careful when talking about mysticism. For instance he writes:

"One cannot be an enemy of Truth and a servant of God. By hating truth and serving evil even under the aegis of organized religion, one earns the same fate as that of atheists who misidentify God as a misdefined version of "science". Worshipping a false God is as bad as, or worse than, worshipping no God at all."

But for him God stands for “Global Operator-Descriptor (GOD)”.

P.S.3. 29-03-23 9:11 Reading E. Prugovecki, "Dawn of the New Man". I had no idea what kind of a book it would be. Surprise, surprise! Here is a piece:

"As Kant had pointed out in his Critique of Pure Reason, there are synthetic judgments that transcend reason and empirical proof. One had to undergo the kind of mystical experience that Anita had allowed me to share with her in order to achieve such deep faith in the Destiny of Man as she already had.

I was also reminded of the metaphysical visions of Plotinus-the last of the great Neoplatonists-who more than two millennia earlier wrote that, when we are “divinely possessed and inspired,” we see not only the nous, the Spirit, but also the One, the Divine. And when we are thus in contact with the Divine, we cannot reason or express the vision in words: “At the moment of touch there is no power whatever to make any affirmation; there is no leisure; reasoning upon the vision is for afterwards.”

Could it be-I asked myself while savoring the memory of my mental and physical union with Anita-that Plotinus’s experiences of “ecstasy” were some form of precognition, rather than just the dreams of a gentle and noble mind turning away from the spectacle of ruin and misery in the world in which he actually lived?" 

and then:

"“I think that what Liu as driving at is the old mind-body philosophical question,” intervened Leonardo. “How does the mind interact with the body? And what is free will?”

Most interesting!"

Further in the same book:

"The lowest estimates on the number of civilization in our galaxy run into the millions. But many might be aquatic ones, not even fully aware of the vast Universe surrounding them. Others might be totally satisfied with their lives on the planets they inhabit. And amongst those that are not, many might be so far ahead of us that they are not interested in making contact with such a puny race as mankind still is. Among those that might be at our stage of development, the enormity of the spatial distances separating them from us and the limit imposed on space travel by the speed of light provides a very effective barrier against direct contact.”

“But what about worm holes, hyperspace, or superstring spatio-temporal dimensions that might make such contacts almost instantaneously possible?” “Ha! Ha!” laughed Ahmed. “I would have thought that as a quantum cosmologist you were well aware that all those Old Era speculations were just mathematical junk, without any firm physical foundation. I wonder why serious scientists ever published such nonsense in the Old Era, especially since their mathematics was even worse than their physics”

I shook my head sadly. “Not only published, but vigorously promoted it. That’s why I stopped being a quantum cosmologist in the Old Era: the so-called ‘leading scientists’ were selling what they sometimes called ‘sexy’ theories basically the same way whores were selling their charms in red light districts. The ‘glitzier’ an idea sounded, regardless of whether it was mathematically and physically sound, the easier it was to sell it!”

.....

"I shrugged. “Those were very different times, Ahmed. One leading physicist in those times by the name of Richard Feynman, who was more astute than the rest, talked about the ‘pack effect’: the pre-dilection of the mainstream scientists during the closing decades of the Old Era to blindly follow fashions dictated by a few self-appointed leaders, regardless of the intrinsic merits of the theories they were advancing. And, as the great physicist Werner Heisenberg noted in his very last paper, published in 1976 O.E., even when the advanced physical theories might have been good, they were ‘spoiled’ by ‘very poor philosophy.’ "

But there is a hope:

"... telepathic signals were not subject to Einstein’s relativistic laws"

...

"The prevalent theory that paranormal scientists subscribed to was that since spacetime was fundamentally quantum rather than classical in its nature, its spacetime events were not “pointlike,” but stochastically extended in their structure. Thus, strictly speaking, at the quantum level there was not a sharp dividing line between past, present and future even locally. Telepathic communication was supposed to be able to take advantage of this fact by deforming the local probabilistic potentialities into stochastically more extended forms, thus enabling meaningful, albeit sporadic and only stochastically reliable, instantaneous communication over enormous spacetime separations."

I like it!


P.S.4. 30-03-23 8:46 Finished reading E. Prugovecki, "Dawn of the New Man". A lot of "romantic" stuff there, quite unexpected! It is a pity that, with so much stress on telepathy and "mind-merging", our Quantum Cosmologist has nothing to say about the mathematics and physics of the future society, where mind reading plays such an important role.
Moving back to reading McGilchrist - "The Matter with Things: Our Brains, Our Delusions, and the Unmaking of the World" (2021). And to Chris Langan's Metatheories. Very important.

P.S.5. 30-03-23 12:26 Two brain hemispheres with complementary roles. To survive eat but not get eaten. Complementarity and duality. Matter-antimatter, universe - anti-verse, so below as above (but not exactly "so"), mind and matter, space and time, waves and particles, bosons and fermions ... what else? Other examples and parallels?

P.S.6 12:52: The Epoch Times: "“Contemporary AI systems are now becoming human-competitive at general tasks and we must ask ourselves: Should we let machines flood our information channels with propaganda and untruth? ..."

P.S.7. 12:55 Reading E. Prugovecki, "Stochastic phase space kinematics of the photon" (1978). It seems almost no one paid any attention to this paper.

Sunday, March 26, 2023

The Encyclopedia Universalis Twists the Truth

The scientific curiosity that drove Alfred Wallace for over forty years is nowadays considered proof that he was an “eccentric”. In the French Encyclopedia Universalis, esteemed by many as a reliable source, the entry on Alfred Russel Wallace by Jacqueline Brossolet, an archivist-documentalist at the Institut Pasteur, comments on this curiosity of his in just one sentence:

At the end of his life, he became interested in sociology, anthropology and human evolution: On Miracles and Modern Spiritualism (1875), Studies Scientific and Social (1900), Man's Place in the Universe (1903). In 1905, he wrote his autobiography My Life.

Alfred Wallace’s first publication on the subject of the paranormal appeared in 1866 – hardly at the “end of his life”, since Wallace died in 1913! If Encyclopedia Universalis is supposed to represent the position of “mainstream science”, in this case it is doing it well - by ignoring inconvenient facts and distorting the truth. But isn’t that what Science accuses Religion of doing?

We will see that a similar fate has been bestowed upon another curious scientist, a contemporary of Wallace, William Crookes. Therefore we are justified in becoming curious as to whether, by some chance, we are dealing here with a rule rather than with an exception.



Whom to believe?

Our beliefs do matter – whether we are a scientist or a priest. They influence our choices, conscious and unconscious. Our choices influence our reality; they influence the choices of other people. The “butterfly effect” may be at work: according to Chaos Theory, one flap of a butterfly’s wing can dramatically change the weather pattern on the planet. Assigning a very small probability to such an effect may depend on our insufficient knowledge of causes and of circumstances.
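The “butterfly effect” is not just a figure of speech: in a chaotic system, two trajectories that start almost identically diverge very quickly. Here is a minimal numerical sketch using the classic Lorenz model with its standard textbook parameters; nothing here is specific to real weather, it only illustrates sensitive dependence on initial conditions:

import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # The Lorenz equations: dx/dt = sigma(y - x), dy/dt = x(rho - z) - y, dz/dt = xy - beta z
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

a0 = [1.0, 1.0, 20.0]
b0 = [1.0 + 1e-9, 1.0, 20.0]          # the "flap": a difference of one part in a billion
t_eval = np.linspace(0.0, 30.0, 7)

sol_a = solve_ivp(lorenz, (0.0, 30.0), a0, t_eval=t_eval, rtol=1e-10, atol=1e-12)
sol_b = solve_ivp(lorenz, (0.0, 30.0), b0, t_eval=t_eval, rtol=1e-10, atol=1e-12)

for t, ya, yb in zip(t_eval, sol_a.y.T, sol_b.y.T):
    print(f"t = {t:5.1f}   separation = {np.linalg.norm(ya - yb):.3g}")

The printed separation grows from one part in a billion to order one within a few tens of time units - the two "weathers" become completely different, which is all the butterfly metaphor claims.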

We have seen that we should not believe Encyclopedia Universalis in everything that we find there. So, whom to believe?

I like to say: “I do not want to believe. I want to know.” Isn’t knowledge better than belief? And yet things are somewhat complicated. Why? Because in some cases when I believe that I know, in reality I do not know. On the other hand, I may know that I believe that I know, which prevents me – to some extent – from making errors.


There is an error in Encyclopedia Universalis – does it mean that we should not rely on encyclopedias in general? Never? But then, what should we rely upon? Experts? Which experts? Experts make errors as well. Experts often do not agree with each other. Sometimes they fight, and they fight violently. Example: in the book “The Neanderthal Enigma”, p. 89, James Shreeve relates a story in which, during a conference in Zagreb, one anthropologist, who had dated an archaic sapiens skull to more than 700,000 years, started yelling and charged the podium when another anthropologist (the one speaking at the podium) claimed that he had dated it to less than half that age. A third anthropologist had to use physical force to separate the two, one of whom obviously intended to do bodily harm to the other. Similar fights between “experts” are usually hidden from the eyes of the public, yet they do occur, especially during conferences. The public at large, however, is mostly exposed to a relatively stable “mainstream science” point of view, one which, now and then, undergoes dramatic revolutions.

What to do?

So, what should we believe and whom should we believe? Let me quote here the advice given by Bertrand Russell:

There are matters about which those who have investigated them are agreed; the dates of eclipses may serve as an illustration. There are other matters about which experts are not agreed. Even when the experts all agree, they may well be mistaken. Einstein’s view as to the magnitude of the deflection of light by gravitation would have been rejected by all experts twenty years ago, yet it proved to be right. Nevertheless the opinion of experts, when it is unanimous, must be accepted by non-experts as more likely to be right than the opposite opinion. The scepticism that I advocate amounts only to this:

(1) that when the experts are agreed, the opposite opinion cannot be held to be certain;

(2) that when they are not agreed, no opinion can be regarded as certain by a non-expert; and

(3) that when they all hold that no sufficient grounds for a positive opinion exist, the ordinary man would do well to suspend his judgment.

This may seem like a reasonable approach, but should we believe Bertrand Russell? Russell was certainly a first class philosopher, an expert in his domain, but what do other experts have to say on the same or similar subjects? Do experts agree on the subject of believing?

P.S.1. 26-03-2023 18:15  From my reading list: 

1. Time-energy uncertainty and relativistic canonical commutation relations in quantum spacetime

Eduard Prugovečki 

Foundations of Physics volume 12, pages 555–564 (1982)

It is shown that the time operator Q^0 appearing in the realization of the RCCR's [Q^μ, P^ν] = −iℏ g^{μν} on Minkowski quantum spacetime is a self-adjoint operator on the Hilbert space of square-integrable functions over Σ_m = σ × V_m, where σ is a timelike hyperplane. This result leads to time-energy uncertainty relations that match their space-momentum counterparts. The operators Q^μ appearing in Born's metric operator in quantum spacetime emerge as internal spacetime operators for exciton states, and the condition that the metric operator should possess a ground exciton state assumes the significance of achieving minimal spacetime 4-momentum uncertainty in fundamental standards for spacetime measurements.
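The step from the commutation relations to the uncertainty relations quoted in the abstract is, presumably, just the usual Robertson bound for self-adjoint operators; schematically (my notation, not the paper's):

\[
[Q^{\mu}, P^{\nu}] = -\,i\hbar\, g^{\mu\nu}
\quad\Longrightarrow\quad
\Delta Q^{\mu}\,\Delta P^{\nu} \;\ge\; \tfrac{1}{2}\,\bigl|\langle [Q^{\mu}, P^{\nu}] \rangle\bigr| \;=\; \tfrac{\hbar}{2}\,\bigl|g^{\mu\nu}\bigr| ,
\]

so that the μ = ν = 0 component gives ΔQ^0 ΔP^0 ≥ ℏ/2, a time-energy relation of exactly the same form as the familiar Δx Δp ≥ ℏ/2 for the spatial components.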

2. Information-theoretical aspects of quantum measurement

Eduard Prugovečki 

International Journal of Theoretical Physics volume 16, pages 321–331 (1977)

We present criteria for comparing measurements on a given system from the point of view of the information they provide. These criteria lead to a concept of informational completeness of a set of observables, which generalizes the conventional concept of completeness. The entropy of a state with respect to an arbitrary sample space of potential measurement outcomes is defined, and then studied in the context of configuration space and fuzzy stochastic phase space.
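For orientation, the notions behind this abstract can be stated in today's common terminology (my paraphrase, not the paper's own wording): a measurement with outcomes labelled by i is described by positive operators E_i summing to the identity, the state ρ assigns it a probability distribution, and the entropy of the state with respect to that sample space is the Shannon entropy of this distribution,

\[
p_i = \operatorname{Tr}(\rho\, E_i), \qquad E_i \ge 0, \qquad \sum_i E_i = \mathbb{1},
\qquad
S(\rho; \{E_i\}) = -\sum_i p_i \ln p_i .
\]

A set of observables is then called informationally complete when the collection of all such outcome probabilities determines the state ρ uniquely.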

3. From E. Prugovecki Biography:

He left for the U.S.A. in 1961 because he had the opportunity to study under Prof. Wightman. Actually, in 1962-63 he helped with proof-reading of the Streater-Wightman monograph PCT, Spin and Statistics, and All That. I mention this because Prof. R. F. Streater was my supervisor during my work on a Ph.D. thesis at Imperial College in London.

At that time Prugovecki thought that the Wightman School stood for ideals to which he strongly subscribed (and still does): the kind of mathematical rigour and basic honesty in science that he found sadly lacking in contemporary theoretical physics, dominated since the mid-1940s by questionable "renormalization schemes" and other techniques that, he personally felt, were doing a lot of harm to a great tradition in science.

In the meantime, his interest in physics had diminished because of an interest in pure mathematics and philosophy of science for which, he thought,  there was no possibility of study at the Institute Ruder Boskovic. At that time and later he was deeply attached to the principles of mathematical soundness and beauty in his work, as always advocated by Dirac. Many quotes from Dirac are used in his last two monographs.

By the mid-1960s it became clear to him that all that Wightman and his followers had to offer was simply another fundamentally unsubstantiated form of dogma, advocated by means of sheer techniques rather than by a truly critical analysis of the foundations of quantum theory. 

So, he decided to move to Canada since he thought that there he would be far enough from the centers of power in US to pursue his own program unmolested, and yet close enough to be able to exert some influence once he began to effectively develop it. Unfortunately, it turned out that he was very wrong in those assumptions. 

Once he began having some real measure of success with his program, things began happening to him. He got the feeling that competition in science in North America was not pursued in the same ethical manner as in Europe. Therefore, he gave me the following sincere and succinct advice for those young Croatian theoretical physicists for whom science is not just another way to acquire influence and power at any cost: stay in Europe!

P.S.2 27-03-2023 10:11

Eduard Prugovecki describes today's Science, as he experienced it, in his futuristic novel  "Dawn of the New Man" (Xlibris 2002)


His description fits the realities that I have experienced myself so well that I simply can't refrain from quoting the whole two-page passage. Here it is:
 
For, fortunately for me, it turned out that I had a brilliant intellect and remarkable academic aptitudes. It also turned out that I could be as single-minded in my chosen pursuits as my father was in his, so that I eventually managed to acquire a doctorate in quantum cosmology while I was still only twenty-one years of age. Hence my father eventually not only became totally reconciled to my not following in his footsteps, but by the time I received my doctoral diploma he could no longer hide his pride in my academic achievements.

THE TOTAL DEDICATION AND intense concentration required to compress course requirements and the most creative type of research, which would normally take ten years or more, into a mere five years, had left me no time while I was a student to observe the mundane features of the academic life that was unfolding around me. But after I had been offered, on my own merits, and without any kind of intervention from my father, a research position at the internationally most distinguished research institution in quantum physics and cosmology at that time, I was suddenly confronted with the everyday realities of life in the world of what had become by that time known as Big Science.

To my utter dismay, I soon discovered that behind the idealistic facade of the world of contemporary science, there lay hidden an incredible amount of pettiness, of short-sightedness, of lack of social conscience, of crass favoritism, and of unprincipled self-promotion. I also discovered what kinds of machinations were required for the achievement of professional reputation, and that excellence in research alone was neither a sufficient nor a necessary condition for professional success.

In fact, the scientific community that was eventually revealed to my eyes had practically nothing in common with that described in the biographies of great scientists that had inspired me in my teens. Instead of honest and objective Pursuers after Truth and Knowledge, by the time I became a scientist myself, the leading figures of contemporary science were by and large mediocrities who had reached their prominent status by using and manipulating their professional contacts rather than on merit alone. Their inconspicuous but all-pervading “networking” was reflected in the exchanges of professional favors; in the acceptance of positions on committees that dispensed research grants, lucrative research positions, academic prizes, and so on; in the shameless advertising and “selling” of their own ideas; in the control of the editorial boards of scientific journals, so that no papers that did not support the often poorly conceived pet theories of the top members in the scientific establishment could get published; and in a plethora of other devices meant to enhance the reputations of those members, and that of the various power cliques to which they belonged.

Those cliques protected and defended what they saw as “their turf” with well-disguised but ferocious determination. For scientific truth no longer mattered. What exclusively mattered was professional glory! And when on rare occasions young idealists like me tried to come forth with worthwhile research that contradicted the ossified and sometimes patently false pet theories of “leading” scientists, thereby threatening their privileged positions, those young scientists soon found out that they could not get their papers published in well-known journals, that they could not get adequate grants, and that if they did not quickly join the conformist crowd of their colleagues, their careers might be ruined by most despicable means, ranging from the denial of well-deserved academic or research positions to downright character assassination.

Dr. E. Prugovecki's Web Pages 


Coming next: Clifford's solution


Friday, March 24, 2023

The Curiosity of Alfred Russel Wallace

A good example is the case of research into the paranormal, and I want to give examples here of two distinguished scientists who were more curious than the rest of the “mainstream” in their time. The first one of these two is Alfred Russel Wallace, known as the co-founder of Darwin’s Theory of Evolution. 



Was he really a co-founder or the originator? This question is still being debated, sometimes fervently. Roy Davies, who researched Darwin’s case for the BBC, summarized the story in his 2008 book “The Darwin Conspiracy: Origins of a Scientific Crime”:

Now, I am convinced that Charles Darwin – British national hero, hailed as the greatest naturalist the world has ever known, the originator of one of the greatest ideas of the nineteenth century – lied, cheated and plagiarised in order to be recognised as the man who discovered the theory of evolution.

Charles H. Smith, Science Librarian and Professor of Library Public Services at Western Kentucky University, created a whole website devoted exclusively to the heritage of Alfred Russel Wallace. After examining all the evidence concerning the extent and the priority of both Darwin’s and Wallace’s contributions, he takes a more careful position:

Question: Did Darwin really steal material from Wallace to complete his theory of natural selection?

Answer: Maybe, though the evidence is something short of compelling.

Alfred Russel Wallace

The fact is that Alfred Russel Wallace diverged from Darwin’s purely materialistic position, and therefore had to be punished by those scientists who took it as their creed that anything that goes beyond materialism is either not worthy of their attention or simply wrong. Alfred Wallace was more curious than most of his colleague scientists, and his curiosity was not well received. The result is that today his works are mostly unknown or ignored.

What happened?

Darwin and Wallace Part Company

Around 1865, six years after Darwin published his celebrated “On the Origin of Species”, Wallace, who was more open-minded than Darwin, became seriously interested in research on the faculties of the human mind, including what is popularly called the “supernatural” – respecting which Darwin did not show any interest at all. Darwin knew all about it in advance, and he did not want to look at the facts; the subject was simply boring for him.

“I wonder why. I wonder why.

I wonder why I wonder.

I wonder why I wonder why

I wonder why I wonder!”

So wrote Richard Feynman, the famous American physicist and Nobel Prize winner, in his book "Surely You're Joking, Mr. Feynman!" (Adventures of a Curious Character). Darwin and many others who held important positions in society, industry and politics did not wonder and did not want to wonder.

Here is what happened.

In 1866, after some experimenting on his own, Wallace published his first book on this subject: "The Scientific Aspect of the Supernatural: indicating the desirableness of an experimental enquiry by men of science into the alleged powers of clairvoyants and mediums."

He was simply drawing the attention of scientists to the phenomena that he considered important and in need of a serious scientific inquiry. The response that Wallace received revealed a great deal about the level of curiosity that is supposed to be one of the major characteristics of a scientific mind.

Augustus De Morgan, the British mathematician who established the foundation for modern logic (De Morgan's laws), was one of the very few who responded with understanding. He confessed, in a letter to Wallace, that he observed the reactions of his students to their exposure to the unknown, and by their reactions could tell whether they were “men of science” or not.

Wallace invited a number of respected scientists – including the noted physicist John Tyndall – to assist in his investigations into psychic phenomena. His idea was to create a new branch of Anthropology. He also invited the convinced Darwinist T.H. Huxley. Huxley refused the invitation, writing: “It may all be true, for anything I know to the contrary, but really I cannot get up any interest in the subject.” Wallace pressed him, and Huxley stated that he’d heard enough of spirit communications to know that they were “so much nonsense” and that “It’s too amusing to be a fair work, and too hard work to be amusing.” (Keep in mind that Huxley was a self-taught biologist in the days when biology was rather primitive, and became one of the finest comparative anatomists of his day. However, he does seem to have had some political motivations. It is said of Huxley that “Before him, science was mostly a gentleman’s occupation; after him, science was a profession.”)

As for Darwin, he only once sat in a séance, in 1874 – or almost did. When the affair was about to start, Darwin suddenly jumped to his feet and left the room. "The Lord have mercy on us all, if we have to believe in such rubbish," he commented. His wife Emma, who was there, explained, "He won't believe it, he dislikes the thought of it so very much." That tells us something about the quality of Darwin’s curiosity – a sine qua non condition, without which Science dies. Perhaps we can better understand now why it is said of him that he “lied, cheated and plagiarised in order to be recognised as the man who discovered the theory of evolution” (Davies, R., 2008. The Darwin Conspiracy. Golden Square, London, p. 162). This raises, of course, an interesting question that might be studied scientifically: does a lack of curiosity also correlate with a lack of character?

Coming next: The Encyclopedia Universalis Twists the Truth

Thursday, March 23, 2023

Curiosity, intellectual freedom and Science

 Curiosity killed the cat? Perhaps, but lack of curiosity or insufficient curiosity kills Science.



Curiosity in one’s work is incessantly asking questions about “How?” and “Why?”. 


Of course children are doing this all the time, but usually they address these questions to adults, and when they try to answer these questions by themselves, they can be in danger because they do not yet have adequate knowledge and experience to fully appreciate the answers. Scientists usually have both knowledge and experience, but their curiosity may have diminished, sometimes completely. Without curiosity Science dies. 


Scientists who are not curious about the Unknown and the Unexplained infect Science with a dangerous disease. In principle this fact is well known. For instance, in the opening section of a 2007 report of the European Research Council entitled “What Makes Scientists Creative?” we could read these words:

Humans are curious by nature and have been seeking knowledge about the universe, our natural environment, our past and future since ancient times. Scientists exhibit a heightened level of curiosity. They go further and deeper into basic questions showing a passion for knowledge for its own sake.

Curiosity is the driving force of basic, or pure, science. The desire to go beyond the established frontiers of knowledge, to explore the boundaries of discipline and to resolve unanswered questions, is motivated essentially by human inquisitiveness. (Italics, mine.)

But it is one thing to know a certain truth “in principle”, and quite another to notice the cases in which the principle should be applied, but is not.

Curiosity is Not Enough: Freedom is Essential

Lack of curiosity, intellectual indifference and laziness – they all kill Science. But even when curiosity is present, Science will not grow as it could – and should - without total intellectual freedom.

Freedom is essential


Bertrand Russell was well aware of the dangers for Science coming not only from the political use of Science, but also from within Science itself. He warned us:

Those to whom intellectual freedom is personally important may be a minority in the community, but among them are the men of most importance to the future. We have seen the importance of Copernicus, Galileo, and Darwin in the history of mankind, and it is not to be supposed that the future will produce no more such men. If they are prevented from doing their work and having their due effect, the human race will stagnate, and a new Dark Age will succeed, as the earlier Dark Age succeeded the brilliant period of antiquity. New truth is often uncomfortable, especially to the holders of power; nevertheless, amid the long record of cruelty and bigotry, it is the most important achievement of our intelligent but wayward species. (Italics, mine.)

New truth is, indeed, often uncomfortable, and the history of Science tells how scientists who were asking the questions “why?” and “how?” have been treated. Learning about how such processes took place in the past enables us to be more sensitive to the very same things happening now, all around us. The battle of the Titans still takes place between Science and religion, but it can also be seen within Science itself.


P.S.1. Out of curiosity I have recently watched the 1987 movie "Fatal Attraction".

Michael Douglas portrays Daniel "Dan" Gallagher

The movie shows how uncontrolled curiosity and losing your scientifically objective moral compass can lead to disasters almost as bad as nuclear war.

Tuesday, March 21, 2023

Wrong use of Science

Before looking into what happens within Science itself, let us discuss some of its uncanny consequences. It is easy to be a devotee of the scientific method; it is not so easy to also be a consciously responsible scientist. The history is well known: dynamite, invented by Alfred Nobel, became a building block of weapons used for killing people. The discovery of radioactivity led to the buildup of nuclear arsenals. The bombs stockpiled today are sufficient for destroying all life on our planet. Artificial Intelligence and Bioengineering can, potentially, lead to a similar catastrophe.

Darth Maladi, a female Devaronian member of the One Sith,
was skilled at bioengineering.

Bertrand Russell On the Destructiveness of Science

In “Religion and Science”, Bertrand Russell summarizes it this way:

The scientific temper of mind is cautious, tentative, and piecemeal; it does not imagine that it knows the whole truth, or that even its best knowledge is wholly true. It knows that every doctrine needs emendation sooner or later, and that the necessary emendation requires freedom of investigation and freedom of discussion. But out of theoretical science a scientific technique has developed, and the scientific technique has none of the tentativeness of the theory. Physics has been revolutionized during the present century by relativity and the quantum theory, but all the inventions based upon the old physics are still found satisfactory. The application of electricity to industry and daily life—including such matters as power stations, broadcasting, and electric light—is based upon the work of Clerk Maxwell, published over sixty years ago ; and none of these inventions has failed to work because, as we now know, Clerk Maxwell's views were in many ways inadequate. Thus the practical experts who employ scientific technique, and still more the governments and large firms which employ the practical experts, acquire a quite different temper from that of the men of science: a temper full of a sense of limitless power, of arrogant certainty, and of pleasure in manipulation even of human material. This is the very reverse of the scientific temper, but it cannot be denied that science has helped to promote it.

The direct effects of scientific technique, also, have been by no means wholly beneficial. (…) they have increased the destructiveness of weapons of war, and the proportion of the population that can be spared from peaceful industry for fighting and the manufacture of munitions.(…) These evils of our time are all due in part to scientific technique, and therefore ultimately to science. (bold, mine.)

Although what Russell is saying is to the point, it does not tell us the whole story. It should be added that the scientific revolution carried within itself certain undesirable effects that, instead of diminishing, are growing larger and more dangerous in our time. In the seventeenth and eighteenth centuries, scientists were often well-educated members of the upper class who had a great deal of leisure and plenty of money to support their research. This was the Golden Age of Science. Later, in the nineteenth century, the economic and political consequences of certain discoveries started to put Science on a wrong path. The pursuit of science became a career rather than a hobby. An army of scientific workers was sought to serve the agendas of what was to become known as the Military-Industrial Complex.

In short: Zeus overthrew Cronus and the Golden Age of Science came to an end.

Coming next: Curiosity, intellectual freedom and Science


P.S.1 22-01-2023 Reading McGilchrist "The Matter with Things":

"In our ordinary way of thinking, things must be established before there can be relationships, and so this about-turn should seem paradoxical; but as I shall explain, paradox very often represents a conflict between the different ‘takes’ afforded by the two hemispheres. However, we must also be prepared to find that, as Niels Bohr recognised, whereas trivial truths manifestly exclude their opposites, the most profound truths do not."

" This is itself a version of the realisation that what applies at the local level does not necessarily apply in the same way at the global level. The failure to observe this principle underlies some of the current misconceptions of both science and philosophy.

I believe that nowadays we live no longer in the presence of the world, but rather in a re-presentation of it. The significance of that is that the left hemisphere’s task is to ‘re-present’ what first ‘presences’ to the right hemisphere. This re-presentation has all the qualities of a virtual image: an infinitely thin, immobile, fragment of a vast, seamless, living, ever-flowing whole. From a standpoint within the representation, everything is reversed. Instead of seeing what is truly present as primary, and the representation as a necessarily diminished derivative of it, we see reality as merely a special case of our representation – one in which something is added in to ‘animate’ it. In this it is like a ciné film that consists of countless static slices requiring a projector to bring it back into what at least looks to us like a living flow. On the contrary, however, reality is not an animated version of our re-presentation of it, but our re-presentation a devitalised version of reality. It is the re-presentation that is a special, wholly atypical and imaginary, case of what is truly present, as the filmstrip is of life – the re-presentation is simply what one might call the ‘limit case’ of what is real. Stepping out of this world-picture and into the world, stepping out of suspended animation and back into life, will involve inverting many of our perhaps cherished assumptions."

"Straight lines, in as much as they can be said to exist at all, do so as the limit case of curves, which constitute all the lines in nature even space and the paths travelled in it are curved). Linearity is the limit case of nonlinearity, and can be approximated only by taking ever narrower views of an infinitely complex picture. The discontinuous, in as much as it can be said to exist at all, is the limit case of the continuous, which is the norm. Total independence is an imaginary construct, the limit case of interdependence, which is universal."

"Let me give a few further examples, which I grant may seem at first sight surprising, even nonsensical. We could start with our own thought processes and their expression in language. The explicit is not more fully real than the implicit. It is merely the limit case of the implicit, with much of its vital meaning sheared off: narrowed down and ‘finalised’. The literal is not more real than the metaphorical: it is merely the limit case of the metaphorical, in which the wealth of meaning is collapsed into a 1:1 correspondence for a useful, temporary, purpose. More importantly, it’s the wider cosmos whose deep structure we are inclined to misunderstand. It may seem obvious that randomness is the primary condition and that order is an unusual phenomenon that emerges from (how?), and is supervenient on, that primary chaos. However, order is not a special case of randomness, but randomness merely the limit case of order, which is the universal norm. Indeed, true randomness is a theoretical construct that does not exist." (bold, mine)

And so I think. And I know what needs to be done. We are consciously aware of only a thin boundary of the Infinite Domain.

  We have to devote some space to Exercise 1 of the previous post .  Back to the roots The problems was: Prove that <ba,c> = <b,ca...