The `story': how (and why) it happened, for those who are curious. For others: index.htm
Asymmetric aging; Twins Paradox
First, some orientation. A complete scientific theory would be like Euclidean Geometry. There would be some initial definitions, which in Math are called `primitive elements.' At best, these are intuitively understandable and widely recognized. Then, a set of axioms is given specifying the structure into which these primitive elements fit. Thereafter, using syllogisms, additional ``truths,'' that is, verifiable statements (relative to the axiom set), are deduced. Good! But the fact is, this scheme doesn't fit science! The real primitive elements are unknown (are they fields, particles, electrons, quarks, gluons, etc., etc.?); and the axioms are unknown. The axioms would be the fundamental theories that the whole enterprise is striving to divine.
So, fundamental Physics is the business of trying to work backwards to identify the elements and axioms. With much agony along the way, it has been found empirically that the optimum way to proceed with this endeavor is to use `observation' (in the form of careful experimentation) and `logic' (mostly in the form of correct mathematics). Up to the beginning of the 20th century, this worked just fine, for the most part. There were some technical points obviously not in order in electrodynamics, but otherwise the theories of the day seemed to be completable in short order.
As is well known, work on some of these small defects, e.g., the discrete nature of absorption and emission spectra, led to the inception of QM. At that point, enthusiasm over early, and relatively easy, success was accompanied by relegating the remaining problems of what is now known as `classical physics' to the back burner, for which shortly thereafter the gas was turned off! To the extent that these old problems got any attention at all, it was said that the new structure would surely show the way to resolving them too, as a kind of collateral fall-out.
This hope seems to many to be fundamentally flawed. In essence, introducing QM just expanded the would-be axiom set of basic physics without removing any of the extant axioms (so that the correspondence principle can function). Therefore, the contradictions or inconsistencies of the existing axiom set were not evaded; adding to this set just makes it bigger, with the same inconsistencies. And indeed, all the pathologies of, for example, electrodynamics reappeared, actually in worse form, in quantum electrodynamics as divergences, etc. They remain today, happily still in place! In short, old problems are worthy of intense attention.
It was with all this in mind that this research program was undertaken. As a concrete matter, the effort has focused on two features characterized by goofy conceptualism and faulty math: non-locality and asymmetric aging. Below, the progress made attacking these `loco' ideas is delineated in some detail.
As a kind of side-bar, some ruminations on the sociological aspects of this type of program:
Sociology (soft ideas) comments.htm
Non-locality is considered nowadays to be an intrinsic quality of Quantum Mechanics as interpreted by the bulk of the so-called `physics community.' It was noticed quite early---Einstein is typically given credit---that the notion of `wave function,' vis-a-vis the observed behavior of electrons at least, must exhibit `collapse,' which as a physical occurrence is an intrinsically non-local event. He illustrated this point circa 1927 by considering the wave function for a particle beam passing through a very small hole. On the back side of the hole, the beam will be an expanding hemispherical wave, which can be directed onto a hemispherical detector, where finally beam-particle impacts are registered as point flashes. If now the wave function is the true and deepest essence of reality, it must have been finite over the whole surface of the detector the instant before impact, but it collapses to a point exactly at impact, logically even faster than the speed of light. Years later, it is now widely believed that John Bell proved with full rigor that QM is indeed ineluctably non-local (more on this below). Thus, this matter can be subdivided into two issues: wave-function identity and Bell's analysis.
Wave function identity:
What it is not.
At the start of this research program, 1972 say, it was widely appreciated that something about QM was ill understood. (Although in those days intense psychological-warfare techniques were routinely employed to convince new students of QM that misunderstanding here was evidence of outmoded thinking habits. This persists today, although more weakly. Obviously, this behavior is a compensation maneuver.) My motivation, or research idea, was to use the then newly available computer power to investigate quantum diffusion by means of simulation (then known as Monte-Carlo techniques). It did not work. All these efforts resulted only in demonstrations of the central limit theorem. This disappointment led this writer to both extensive literature searches and reflection.
The first thing to become clear was that diffusion processes are governed by a parabolic differential equation, while the Schroedinger equation, despite involving only a single time derivative, is a hyperbolic differential equation: the factor of ``i'' is equivalent to another time derivative. This single observation is fatal for all attempts to associate diffusion or Gaussian stochastic processes with whatever underlies QM. Eventually, this point found an outlet as a publication, 3., as a comment on a paper by Nelson, whose work is very widely thought, even today, to show that there is in fact a connection to diffusion. Rumor has it that Nelson himself nowadays considers that his conceptions encompass covert non-locality.
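The disappointing Monte-Carlo outcome mentioned above can be illustrated with a minimal sketch (a generic, hypothetical reconstruction, not the original code): an unbiased random walk, whatever its step distribution, spreads out as a Gaussian with width growing like the square root of the number of steps, exactly as the central limit theorem dictates, and never produces anything wave-like.

```python
import math
import random
import statistics

def random_walk_endpoints(n_walkers=20000, n_steps=200, seed=1):
    """Simulate many independent unbiased random walks of +-1 steps;
    return the list of final positions."""
    rng = random.Random(seed)
    return [sum(rng.choice((-1, 1)) for _ in range(n_steps))
            for _ in range(n_walkers)]

endpoints = random_walk_endpoints()
# Central limit theorem: zero mean, spread ~ sqrt(n_steps) ~ 14.14,
# and a Gaussian shape -- diffusion, not interference.
print(statistics.fmean(endpoints), statistics.pstdev(endpoints))
```

Whatever microscopic step rule one chooses, the endpoint histogram converges to the same Gaussian; this is why such simulations demonstrate only the central limit theorem.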
What it can be:
Perusing the literature circa 1973 turned up several papers ``deriving'' the Planck blackbody spectrum without Planck's quantum hypothesis. Instead, the basic input notion was that it results from interaction with an electromagnetic background of a particular character. While these papers had nothing to say about the essential features of QM, that is, about waves, interference, and deterministic hyperbolic evolution, nevertheless, in so far as they spoke to the very issue that kicked off the development of QM in the first place, it seemed not unreasonable to ask whether this basic input could be extended to cover those wave aspects of QM too.
Other clues (since lost) found in the literature led to the notion that de Broglie waves have structure in common with standing waves, not traveling waves. This in turn suggested concepts that were then mulled over and elaborated through seemingly endless dead-ends, until they led to a ``model'' of QM processes presented in a series of papers, based on the idea of background-supported pilot waves. Although each of these papers contains what is intended to be a unique nugget of insight, they are largely redundant---reading only the last suffices to get the picture.
The final result of the development of this paradigm for QM is still incomplete. However, among the alternatives, all bad to various degrees, it can reasonably be argued that it is the best. It solves Feynman's conundrum (how to understand particle-beam deflection without non-locality or other mysteries), but in the end it too is based on an untenable proposition, namely that an essentially infinite amount of background energy is available at every point in space. (Some proponents of ``zero point'' background fields seem charmed by the image of the `vacuum' actually being a `plenum,' or sea of positron-electron soup boiling fiercely away! Such flights can just as easily be seen as more of the mystical hocus-pocus, anti-scientific excesses so inimical to, say, sewer construction. All play and no work.) In short, as it now stands, this paradigm tidies up the scenery by concentrating all the disparate inconsistencies and mysteries of QM at one single point, this infinite energy. So, as the best of a bad lot, it awaits completion.
In shameless exploitation of the reputation of mathematics as logically irrefutable (if correct) truth, a certain `demonstration' by John Bell has been rebaptized a `theorem.' In fact, it is neither a theorem nor so denoted by Bell himself. It is an argument whose validity depends on its hypothetical suppositions, both explicit and covert. It turns out that it is not hard to show that Bell's analysis contains suppositions, however reasonable in appearance at a cursory look, that are absolutely untenable. The main one, in fact, is a rather pedestrian math error, facilitated by faulty notation.
Unraveling these realities in the course of this research project involved several stages of insight. The challenge was always to determine exactly what the symbols (and words) used by Bell and his disciples were supposed to mean. This is no small task. This writer himself, in spite of being convinced from the moment he first learned of ``Bell's theorem'' that its final conclusion must be wrong, persisted in overlooking rather simple defects. This occurred because the idea of ``locality'' is defined through a mix of probability theory and relativity (or electrodynamics). In the early literature, the definition is given (mostly only implied) in terms of the factorability of probabilities. In many of these early discussions there is pervasive obscurity regarding the interrelationship of `statistical independence' and `causality.' To further obfuscate matters, many discussions were formulated so as to admit the possibility of causal connections, i.e., those for which causes strictly precede effects, by mysterious `quantum interactions' independent of electromagnetic interactions.
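Schematically, the factorability definition at issue takes the following form (my rendering of the early literature, not a quotation of any particular paper), where A and B are the outcomes on the two sides, a and b the detector settings, and lambda the hidden variable with distribution rho(lambda):

```latex
% Locality encoded as factorability of the joint probability:
P(A, B \mid a, b, \lambda) = P(A \mid a, \lambda)\, P(B \mid b, \lambda),
\qquad
E(a, b) = \int \mathrm{d}\lambda\, \rho(\lambda)\, A(a, \lambda)\, B(b, \lambda).
```

Whether this factorization expresses `no causal influence' or the much stronger `statistical independence' is exactly the obscurity described above.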
In the course of these researches, clarity was brought to the issue only through the back door, as it were. The first point to become clear was that the interpretation of data from EPR experiments testing Bell's analysis mixed the statistical dependence of the outcomes, right and left, up with the interpretation of the wave function, although the `theorem' itself nowhere mentions specifically quantum aspects. Further, the mathematical sloppiness in failing to deal with the distinctions between discrete, usually dichotomic, data points and continuous, even harmonic, functions as limits of densities of data points was alarming to me and to some others.
The latter point seemed particularly vulnerable to abstract attack; in fact, while I was listening to another champion of non-locality at a conference on the foundations of modern physics, this idea fell into place and in seconds yielded the argument eventually published in 3. Getting this published was the occasion of a lesson in the realities of institutional physics. This argument is bloodless mathematics. It can be totally dissected with logically rigorous arguments beyond dispute. The reaction, however, from PRL referees and editors was utterly shameless in its cynicism and lack of conscientiousness. There was no limit to how low, actually how stupid, they would go. Even simple arithmetic errors (theirs, not mine) were proffered as reasons to reject it.
Naturally, this experience motivated an effort to go beyond logically sufficient arguments towards sociologically overwhelming ones. Successes (at research, not publication) followed rapidly. To some extent, this development is reflected in submissions to the arXiv. The first, and best, breakthrough was a flash of insight obtained while rereading Feller's text on probability theory. While there was nothing new and unknown in it, there came the sudden realization that Bell's notation had probably misled him, just as it had until that time misled me, regarding the meaning of the symbols in his fundamental assumption. It turns out that Edwin Jaynes was the first to spot this error.
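Jaynes's point, as I read it, can be put in one line: the product rule of probability theory only yields Bell's factorization when the outcomes are statistically independent, which is precisely what was to be proved, not assumed:

```latex
% The product rule, always valid:
P(A, B \mid a, b, \lambda)
  = P(A \mid a, b, \lambda)\, P(B \mid A, a, b, \lambda).
% Bell's form drops the conditioning of B on A:
% P(A, B | a, b, \lambda) = P(A | a, \lambda) P(B | b, \lambda),
% i.e., it encodes statistical independence, not merely the absence
% of faster-than-light causal influence.
```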
Once this understanding was achieved, it was obvious to me that there had to be a similar flaw in the Kochen-Specker variants of Bell-type arguments (including Mermin's rendition). The original versions of the KS `theorem' are very opaque; just plain complex. But Mermin's version is simple enough, and close enough to a physical application, that the error in reasoning---i.e., applying results that are not simultaneously possible on physical grounds---literally leaps to the eye as soon as one ceases to cower before formalism.
If Bell was wrong, then it must be possible to mimic EPR experiments with fully local, realistic models. Some of the very first versions of such a model were, although technically correct in my view, not fully faithful to the physics of actual experiments, and therefore not fully satisfactory. Nevertheless, the story as developed up to circa 1998 finally found an outlet in 12.
For anyone with a nose for nonsense, `teleportation' is outrageously stinky. The controlled conversion of matter to radiation, then back to matter, defies, inter alia, the second law of thermodynamics, according to which ongoing physical processes always lose information to the surroundings. Thus, were `Scotty' beamed up, the reconstituted Scotty would have to be less than the original, making Scotty in short order a ghost. Introducing this term into physics, as an adjunct to the already loco notion of non-locality, therefore screams out for sober attention.
On the instinct that the inherently nonsensical assumptions in explanations of experiments billed as exhibiting `teleportation of photons' would jump out in a calculation, some thought and programming were given to exactly this task. Coincidentally with the completion of this task, there appeared a paper on a new experiment involving 4-fold coincidence analysis, which, by reason of symmetry, simplified the math involved. It was a simple matter to apply the just-completed calculation algorithm to the just-appeared paper on the arXiv and write it up. As it turned out, this rejoinder was nearly submitted before the experimental paper itself was submitted to PRL! Oy vey! the circus that resulted from submission of the rejoinder to PRL. It was originally rejected on nothing more than the reputation of my address; rebuttals were then conscientiously considered---so long as the full meaning was not understood by the editorial staff at PRL! Once it became at all clear to them what it actually meant, however, again no excuse was too sleazy to use to justify rejection.
However, the argument is valid, rigorous, and simple. It found support among several experimentalists, who know the phenomena from hands-on experience, not from sorting symbols or abstract gymnastics. It was published in 16. See also 17. for a specialization to ``Franson''-type experiments. Clear thinking will not be fostered by indulging this misnomer, `teleportation'; what the experiments demonstrate is `coincidence filtering,' not teleportation. Nothing is ported across any `tele' (`distance' in Greek), not even information (except in the classical channel, as in telephoning, hardly a big deal!). See: 16.
The papers mentioned above, i.e., 16. and 17., provide local realist models and calculations of EPR and GHZ experiments. These models all serve their purpose of being at least technical counter-examples to the conclusion of the so-called ``Bell theorem,'' namely that any such model is impossible. Still, they can be criticized for lacking insight into exactly how the physics works. For this purpose, a better model would perhaps be a detailed simulation that marches through the experiment event by event.
On the basis of the above results, programming such a simulation was expected to be, at least in principle, easy. Nevertheless, the devil in the details, and the necessity of learning the syntax of new computer codes, provided some challenges, eventually overcome and published in 22 and 23.
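To give a flavor of what `event by event' means here (a generic textbook-style local hidden-variable sketch, not the model of 22 and 23), each particle pair carries a shared polarization angle lambda, and each detector computes its outcome from its own setting and lambda alone:

```python
import math
import random

def run_epr(a, b, n_pairs=100000, seed=7):
    """Event-by-event local model of an EPR run: each pair carries a
    hidden angle lam; each side decides its outcome from its own
    setting and lam only. Returns the correlation E(a, b)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_pairs):
        lam = rng.uniform(0.0, 2.0 * math.pi)       # shared hidden variable
        A = 1 if math.cos(a - lam) >= 0 else -1     # left side, setting a
        B = -1 if math.cos(b - lam) >= 0 else 1     # right side, setting b
        total += A * B
    return total / n_pairs

print(run_epr(0.0, 0.0))          # -1.0: perfect anti-correlation
print(run_epr(0.0, math.pi / 4))  # linear law: about -1 + 2|a-b|/pi
```

This naive model reproduces perfect anti-correlation at equal settings but yields the linear, not the cosine, dependence on the angle difference; closing that gap with a faithful treatment of the detection and coincidence circuitry is exactly the nontrivial part alluded to above.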
Asymmetric aging (time dilation) and its partner, Lorentz-Fitzgerald contraction, can be seen either as astonishing physical phenomena or as space-time perspective effects. If they are the latter, then they present little to worry about. Nevertheless, they have vexed the development of theoretical physics for a hundred years, especially relativistic mechanics, via, inter alia, no-interaction theorems.
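For reference, the two effects in their textbook form, for relative speed v and the usual Lorentz factor:

```latex
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad
\Delta t = \gamma\, \Delta t_0 \quad \text{(time dilation)}, \qquad
L = \frac{L_0}{\gamma} \quad \text{(Lorentz--Fitzgerald contraction)},
```

where the subscript 0 denotes the quantity measured in the rest frame. The dispute below is not over these formulas but over whether they describe asymmetric physical effects or symmetric perspective effects.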
One can say that the fault accounting for the fact that QM has not developed in an easily acceptable form is the lack of micro-models for the underlying physical processes. Failure by the founders to find such processes induced them (actually, mostly the following generations), in a fit of hubris, to attempt even to ordain that there are no such processes; i.e., that QM is ``complete;'' theirs was the final word. It is to be taken virtually as `divine revelation.' Every single feature of it must be accepted ``as is.''
One of those features, found by trial-and-error in the first place and ensconced in the formulation of nonrelativistic QM, is that position variables correspond, for incomprehensible physical reasons, to operators, while time is a parameter conjugate to the Hamiltonian that determines the dynamics. Thus, time and space do not have the same status in QM, contrary to the spirit of relativity in the Minkowski formulation. This led to the curiosity that, whereas there is a nonrelativistic multi-particle wave equation, there is no relativistic version. The best relativistic quantum theory can do is give an equation for a single particle bathed in a field.
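Schematically, the asymmetry is visible already in the basic formalism: position is promoted to an operator with the canonical commutator, while time enters only as the evolution parameter,

```latex
[\hat{x}, \hat{p}] = i\hbar, \qquad
i\hbar\, \frac{\partial}{\partial t}\, \psi(x, t) = \hat{H}\, \psi(x, t),
```

so that x is an operator but t is a c-number, never an observable in its own right.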
Efforts to evade this circumstance by various abstract attacks using group theory and the like got nowhere; instead, they resulted in several `theorems' ostensibly proving that any Hamiltonian formulation of relativistic dynamics, quantized or not, cannot incorporate `interaction' between particles. In the 1960s and '70s such theorems appeared in the literature regularly. This result seemed to this writer very suspicious, because obviously in nature particles do interact, and Hamiltonian Mechanics is really just gussied-up differential equations, which work just fine for continuous physical progressions.
Moreover, this writer was aware of Cartan's formulation of (Hamiltonian) Mechanics, which is so abstract that it is valid before the imposition of a metric (even the Lorentz metric) and which admits formulations in any dimension (say, the 4×N configuration space of N interacting particles). Since this can be seen as an existence proof of exactly what these no-go theorems seemingly preclude, the situation cried out for reconciliation. After some effort, the matter did in fact fall into place; the results appeared in 1, 4, 5, 10, 14 and 21.
AAAD Mechanics and Asymmetric Aging
The exigencies of life (mine, anyway) put this line of investigation aside for 20 years, at which time discussions with a son, who was struggling with the conundrums of relativity, brought these questions back into focus. Doodling around with Minkowski charts, for the intended purpose of making the twin paradox look reasonable, seemingly by accident produced a version that clearly implied that there should be no asymmetric aging! No amount of scratching and searching revealed an error in the logic apparent in this `doodle'! What's worse, I found that the logic of the doodle was already contained in my very own papers---although manifestly in a form that was not fully understood at the time. That is, in 6., the notion of the so-called ``Case II'' is a full-blown repetition of the error in thinking that leads to asymmetric aging in the first place.
Efforts at publishing these thoughts in wide-circulation professional (using this adjective only in its sociological sense) journals elicited a predictable response. No referee found it necessary even to mention the graphic, the heart of the argument! Having been co-opted into endorsing an antinomy, evidently their very self-respect demands that heretics be banished! Just: kill the messenger!! Anyway, dear reader, judge for yourself; or challenge your local physics guru to put his finger exactly on the error in the graphic. But be prepared for a possible ad-hominem outburst!
This webpage is, consequently, dedicated to all those who, of course, know better than I, and generously intend to set me straight---as soon as they find time for these pedestrian issues!
Back to start: index.htm