Neurolinguistics and Fundamental Physics



Presentation in plenary session on July 7, 2000,
at the congress CONGRESS-2000
held at the St. Petersburg State University, Russia

(Introduction to the paper titled
"On an Expanded Maxwellian Geometry of Space"
presented at the congress [pages 291 to 310 of the proceedings])

From my perspective, a physical theory must be elaborated exclusively from experimentally obtained information, as were, for example, Newton's Theory, Maxwell's Theory, and Planck's concept of the quantum of action.

(1- Identifying the restricted set of stable particles)

In the course of the past 100 years, all massive, elementary and stable particles that can be scattered against non-destructively, namely the electron, the positron and the up and down quarks, as well as all of their physical properties, have clearly been identified.

The observation that gravitation has not yet been explained, in spite of these verified findings, led me to suspect, like many others, each for a variety of reasons, that something very fundamental may have been misunderstood or neglected in fundamental physics.

This suspicion caused me to reconsider the accepted space geometry and to carefully re-examine what properties of particles had positively been verified, an exercise that led me to elaborate this theoretical solution, which is based exclusively on the objectively verified properties of stable fundamental particles.

(2- About the loss of interest for the temporal sequences of events)

Retrospectively, I find that the acceptance of Quantum Electrodynamics, which introduced the idea of virtual photons, and of its direct offshoot Quantum Chromodynamics, which extended the concept of virtual photons to virtual particles, as physical theories rather than as simply handy mathematical tools, was a determining factor in the neglect, during the past half century, of the importance of the Coulomb interaction at the fundamental level. This acceptance generalized the perception that pseudo-quantized virtual entities could physically represent the Coulomb potential that is progressively induced between real particles during scattering and high-energy collision events, as an inverse function of the square of the distance between the interacting particles.

I also believe that the general acceptance of the static Lagrangian method over the dynamic Hamiltonian method, as suggested by Feynman in the framework of his definition of QED, has been instrumental in the loss of interest in the fact that scattering and collisions between particles are precise temporal sequences of events.

Feynman's conclusion that the use of the Hamiltonian forces the adoption of the field viewpoint rather than the interaction viewpoint is unfounded in my view, because it can easily be argued that a relative interaction between particles can be nothing other than simultaneous and mutual, just as in the field viewpoint.

If, as Feynman suggests, the Coulomb interaction is mediated solely by an exchange of virtual photons between the particles involved, the following questions come to mind:

(3- About the fact that any interaction involves both a force and some energy)

I see an obvious causality problem here, because a mediation of the exchange by virtual photons as proposed by Feynman mixes two fundamentally very different aspects of the relation between particles: the force acting between them and the energy that this force induces.

There is also the problem that virtual photons are by definition discrete quantities, which seems to imply that the potential is induced in discrete increments between the particles, in direct contradiction with the fact that the quantity of energy of motion is progressively induced at any distance as a function of the infinitesimally progressive inverse-square law of the distance.
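As a purely illustrative aside (standard textbook relations, not drawn from the congress paper): the Coulomb force between two elementary charges, and the kinetic energy it induces as the particles move from a separation r_2 to a closer separation r_1, are both continuous functions of distance, with no discrete increments anywhere:

\[ F(r) = \frac{e^2}{4\pi\varepsilon_0 r^2}, \qquad \Delta E = \int_{r_1}^{r_2} F(r)\,dr = \frac{e^2}{4\pi\varepsilon_0}\left(\frac{1}{r_1} - \frac{1}{r_2}\right) \]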

(4- About the importance of the temporal sequences of events)

Since scattering and collision events are not actually physically instantaneous, I also think that there are good reasons to question Feynman's opinion when he declares, and I quote:

"In many problems, for example, the close collisions of particles, we are not interested in the precise temporal sequence of events. It is of no interest to be able to say how the situation would look at each instant of time during a collision and how it progresses from instant to instant." ([6], p.771)

Needless to say, I deeply disagree with Feynman on this issue, because this research philosophy has induced the following generations of respectful physicists to refrain from exploring, for the past 50 years, the only remaining unexplored frontier in fundamental physics:

A further argument supporting the view that virtual photons cannot be a physical reality is Bohr's own observation that, the quantized states of atoms being stable, these states can be accompanied by no radiation whatsoever, and that the existence of such non-radiating states conforms to the idea of quantum stability ([1], p.134). I also fully agree with de Broglie that the existence of quanta implies a lower limit of a very special nature to the perturbations that can exist in the systems considered ([1], p.20).

Consequently, the perception that radiation could be emitted by electrons on rest orbitals, for example by means of virtual photons or otherwise, appears to be in direct contradiction with the very foundation of quantum mechanics, because Bohr's and de Broglie's observations imply that some threshold of local excess energy intensity, relative to the local electromagnetic least-action equilibrium level, must be reached or exceeded before a photon can be emitted.

Between local least-action equilibrium and this intensity threshold, the excess energy can only cause local oscillation; but when the threshold is reached, conversion to the photon state initiates, and evacuation from the system considered of the energy locally in excess allows local equilibrium to be reestablished.

In view of the possible existence of such a relative quantization threshold, the physical existence of virtual photons as mediators of an interaction that results in the induction, between the particles involved, of energy that is unquantized by definition, simply because its local relative intensity has not yet reached that threshold, appears highly questionable except as a convenient mathematical artefact.

(5- About the need to redefine the fundamental space geometry)

The space geometry that must underlie Maxwell's Theory simply requires that a magnetic field and an electric field orthogonally intersect each other while both of them intersect at a right angle the direction of motion of the electromagnetic wavefront in three-dimensional space.
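For reference, this is the mutual orthogonality exhibited by the standard plane-wave solution of Maxwell's equations (textbook notation, recalled here only for clarity): with the electric field along the y-axis and the magnetic field along the z-axis, the wave propagates at speed c along the x-axis:

\[ \vec{E}(x,t) = E_0\,\hat{y}\cos(kx - \omega t), \qquad \vec{B}(x,t) = \frac{E_0}{c}\,\hat{z}\cos(kx - \omega t), \qquad c = \frac{\omega}{k} \]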

Given that such a wave, if it existed as such, would necessarily expand spherically in space from its point source, the possibility came to me that, on the assumption that the wave coming into being already possessed all of its electromagnetic characteristics at this very point source, the orthogonally intersecting magnetic and electric fields envisioned by Maxwell might well not reside within our three-dimensional space at that point source.

Given that it seems impossible to visualize more than 3 dimensions at a time, I then took up the habit of mentally folding the 3 dimensions of normal space as if they were the ribs of an open three-rib metaphorical umbrella, in my attempts to more easily visualize the space geometry then required. This allowed viewing the whole geometry as a normal 3-axis Euclidean geometrical system, with the z-axis associated to the magnetic field, the y-axis to the electric field, and the plane determined by the z and y axes moving at the speed of light along the x-axis, which represents our normal 3-D space.

Having applied the "umbrella" idea to the normal space axis, the further step of extending the idea to the other two axes was easily taken, thus defining an intriguing new geometry of three orthogonally coexisting spaces, each internally possessing 3 dimensions: a metaphorical mental "Rubik's cube" that I became very fond of playing with, mentally opening and closing the umbrellas one at a time as needed to continue being able to easily visualize the whole geometry. Of course, this mental opening and folding of "umbrellas" has no impact on the real spaces that would be represented; they are permanently open and fully extended at all times.

It is to this space geometry that I then undertook to relate all verified properties of elementary scatterable particles, each of which is the focus of a local occurrence of intersection of these three spaces, and each of which is separated from all others by total vacuum.

(6- About de Broglie's dynamic internal EM structure of the photon)

The actual element of information that triggered the chain of reasoning leading to the solution proposed here is a conclusion by Louis de Broglie that photons must be constituted not of one corpuscle, but of two corpuscles, or half-photons, that would be complementary in the same way that the electron is complementary to the positron.

I eventually came to clearly visualize in this expanded space geometry a possible mechanics of how a photon with energy of 1.022+ MeV could convert to an electron/positron pair, as experimentally confirmed in the 1930s. Eventually, a plausible mechanics of interaction of electrons and positrons took shape that provided a key to understanding how protons and neutrons could come into being.
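The 1.022 MeV figure mentioned here is simply the well-established pair-production threshold, equal to twice the rest mass energy of the electron (a standard value, recalled for context):

\[ E_\gamma \;\ge\; 2m_ec^2 = 2 \times 0.511\ \text{MeV} = 1.022\ \text{MeV} \]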

The end result was a seamless series of clearly defined interaction sequences that provides an uninterrupted path of causality from the unquantized quantities of kinetic energy induced between particles through Coulomb interaction, to the quantization of that energy in the form of photons, to the creation of electrons and positrons from the destabilization of photons of sufficient energy, and finally to the creation of protons and neutrons from the interaction of electrons and positrons when they are forced, in groups of 3 including both types, into sufficiently small volumes of space.

Such a 9-dimensional, 3-space local geometry is, in my view, the most restricted reference frame that still allows the elaboration of such a clearly defined causality sequence.

(7- About the harmonization of the established theories)

The most surprising outcome, however, appears to be a confirmation of Newton's Gravitational Theory in a more precise relativistic form which, by replacing the Newtonian concept of "point-like particles" with that of "charged point-like particles", provides an explanation alternative to that of General Relativity for the Newtonian error in the calculation of the perihelion advance of Mercury, for the correct calculation of the gravitational deflection of photon trajectories by stars, and for the increase in frequency of cesium atoms in cesium clocks with altitude. It also provides a gravitational solution to the unexplained anomalous constant residual acceleration directed towards the Sun and the unexplained rotation anomaly observed for Pioneer 10/11 ([7]) and ([38], p.23), and to the unexplained anomalous planetary flybys of Galileo, Ulysses and other spacecraft, all of them related to inertial hyperbolic trajectories; and it finally provides a workable solution to the problem of the excessive heat of the solar corona.
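For context, the figure that any such alternative account must reproduce for Mercury is the standard relativistic perihelion correction (given here in its usual General Relativity form, not taken from the congress paper), where a is the orbit's semi-major axis and e its eccentricity:

\[ \Delta\phi = \frac{6\pi G M_\odot}{c^2 a (1 - e^2)} \ \text{per orbit} \;\approx\; 43''\ \text{per century} \]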

Surprisingly, this solution draws a natural bridge between Maxwell's electromagnetic theory, which it confirms in a manner that allows it to directly describe photons, the Coulomb interaction, and Newton's gravitational theory upgraded to relativistic status; and it refocuses in a new perspective the bulk of accepted orthodox theories, namely Special Relativity, Quantum Mechanics and Quantum Electrodynamics, as well as many of the postulates that are now taken for granted.

(8- About the unresolved issues)

A number of unresolved issues will be addressed in this paper. For example, de Broglie's important conclusion regarding a possible internal structure for photons which, in conjunction with Abraham's and Kaufmann's discovery regarding the insensitivity of unidirectional kinetic energy to any force applied transversally, seems to be the very key to building the last missing causality link between the kinetic energy that accumulates by means of electromagnetic acceleration of particles and the energy that up and down quarks must be made of.

Special Relativity, for its part, has not yet been adapted to account for the internal adiabatic contraction and expansion of complex particles such as protons and neutrons as a function of the local intensity of electrostatic interactions between the elementary charged components of surrounding matter (up and down quarks), nor for the impact of this interaction on the local rest mass of these complex particles as a function of the local density of surrounding matter (the local intensity of gravity). SR still deals with protons and neutrons as if they were rest-mass-invariant elementary particles!

Could this be why no one can currently properly calculate the trajectories of the Pioneer 10 and 11 space probes, even with the equations of General Relativity, a theory that is supposedly the final word on all observed inertial and gravitational phenomena? Moreover, data gathered for other spacecraft definitely hints at the possibility that this "assumed anomalous" acceleration phenomenon is systematic and due to some aspect of fundamental reality not yet covered by traditional theories.

For the past decade, GR has persistently been searched, to no avail, for ways to account for two distinct so-called anomalies observed regarding the Pioneer spacecraft: one pertaining to a seldom-documented so-called "anomalous" loss of angular momentum about their spin axis, and the other pertaining to a so-called "anomalous" acceleration directed towards the Sun as they escape the Solar system on their hyperbolic trajectories.

Could these failings of GR be due to the fact that SR (with which GR is intimately associated) does not yet take into account the relativistic implications of the fact that protons and neutrons are not elementary, and that their momentary rest mass may well depend on the velocity of the quarks making up their structure, a velocity that itself depends on local electrostatic force intensity? This model will put in perspective how the effective rest mass of complex particles can be correctly integrated.

Other very well documented phenomena that SR and GR are unable to explain are the facts that the Moon is progressively receding from the Earth at a rate of about 3.8 cm per year, and that the Earth's rotation is faster in summer than in winter while steadily slowing down from year to year, in a manner that no current theory can account for.

We will study how these phenomena can possibly be explained by the same fundamental cause that explains the apparent anomaly in the trajectories of the Pioneer 10 and 11 spacecraft and the slowing down of atomic clocks. The new model in fact predicts that this so-called anomaly is no anomaly whatsoever, but a normal behavior of all small bodies moving in space.

It is well known that at short gravitational range such as the distance between Mercury and the Sun, GR has proven to be more precise than Newton's classical theory.

Both theories, however, have been proven to work out the same results for all translational motion of all "observable matter" at galactic and intergalactic ranges, at which all planets and stars behave as if they were point-like relative to each other, for both the luminosity and virial theorem methods ([23], p.389).

(9- About observable matter vs "dark matter")

A note of interest here regarding "observable matter" at galactic and intergalactic ranges concerns the so-called "dark matter". Let us clearly understand that "observable matter" at these ranges refers exclusively to matter whose luminosity is detectable from these far galaxies.

In 1933, astronomer Fritz Zwicky observed that the mass of a cluster of far galaxies calculated from its luminosity, when compared to the mass of the same cluster calculated by a different method (the virial theorem), gave a much larger figure with the latter method than could be estimated from the luminosity alone. This observation gave birth to the theory that "dark" invisible matter must exist to explain the difference.
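For the reader's reference, the virial estimate in question has the standard order-of-magnitude form (recalled here for context): for a cluster of radius R whose member galaxies show a line-of-sight velocity dispersion σ,

\[ M_{\text{virial}} \sim \frac{\sigma^2 R}{G} \]

Zwicky found this dynamical mass to be far larger than the mass inferred from the cluster's total luminosity.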

From the outset, it is easy to observe that most of the matter in the solar system, from asteroids, planets, and interplanetary particles and debris of all sorts held captive by the Sun's gravitational field, up to and including the Oort cloud and the interstellar dust and debris clouds of all natures that have accumulated since the beginning of the universe, does not emit light but can only reflect some of the light coming from the Sun. The mass of all that non-light-emitting matter would likewise not be computable from the luminosity of the Sun, not to mention the masking effect it would contribute by interfering with the Sun's light if the solar system were observed from interstellar distances.

Given the age of the Universe and the constant ejection of matter from stars and their coronas from their birth on, doesn't common sense at least suggest that the greater part, if not all, of the so-called missing mass (as calculated from luminosity) could simply be this very normal type of matter, undetectable at such distances simply because it is not hot enough to emit light?

More exotic theories, however, seemed much more appealing to the community at large from the start, which thus tended to prefer postulating the existence of large quantities of some hypothetical abnormal, unknown, undetectable and all-pervading extra "dark matter", and the existence of an equally hypothetical and undetectable "strange or dark energy", to "explain" the discrepancy.

Shouldn't the fact that none of the exotic particles conjured up in theory to replace normal matter as the constituents of "dark matter" are anywhere to be found on the Earth or in the Solar system, while they would supposedly be plentiful far away where it is impossible to detect them, be a telltale sign that they may not exist at all?

(10- Considering a simpler solution)

Hasn't the time come to return to simple and logical explanations and to explore these avenues, which are much more promising in terms of concrete results? Let us see where following the lead provided by the really existing scatterable particles that are actually detectable here on Earth will take us.

We will also see why the Higgs boson, whose existence is postulated to explain the existence of mass, is not even required, and that a much simpler explanation, stemming directly from simple energy inertia and electromagnetism, completely justifies the existence of mass.

REFERENCES

· [1] Louis de Broglie. La Physique Nouvelle et les Quanta, Flammarion, France, 1937; Second Edition 1993, with a new 1973 preface by L. de Broglie.

· [6] Richard Feynman. Space-Time Approach to Quantum Electrodynamics, Phys. Rev. 76, 769 (1949).

· [7] Anderson, Laing, Lau, Liu, Nieto and Turyshev. Indications, from Pioneer 10/11, Galileo, and Ulysses Data, of an Apparent Anomalous, Weak, Long-Range Acceleration, gr-qc/9808081, v2, 1 Oct 1998.

The complete set of articles published in the "Electromagnetic mechanics of elementary particles" project is available here:

INDEX - Electromagnetic Mechanics of Elementary Particles


