This page examines the notion of "Process Physics" as defined in a substantive paper Process Physics: From Information Theory to Quantum Space and Matter by Associate Professor Reginald T Cahill, the deputy head of the School of Chemistry, Physics and Earth Sciences at The Flinders University in Adelaide, South Australia.
Cahill's paper conflates two broad philosophical claims and two major physical claims which may not be quite as interdependent as he portrays them to be:
While three of those four claims make at least general sense, it may prove to have been premature for Cahill to invest so much weight in his particular choice of micro scale information theoretic primitive, especially in the light of Stephen Wolfram's A New Kind of Science which expounds a plethora of "equivalent" "simple programs" and identifies a broad capacity for a major subclass of all such systems to generate randomness. Even though there is at least a hint of consensus even beyond Wolfram and Cahill that evolving networks (graphs in the graph-theoretic sense) are a likely candidate for that underlying primitive, we may still face problems reconciling that idea with observed liquid-like dynamics of the aether. (Three dimensional space is seen here as an emergent macro scale property of the network/aether.)
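A minimal example of such a "simple program" (my own illustration, drawn from neither author's text) is Wolfram's elementary Rule 30, whose update rule looks only at each cell's immediate neighbours yet produces a centre column that appears statistically random:

```python
# Hedged sketch: Wolfram's Rule 30, the canonical "simple program" that
# generates apparent randomness from purely local update rules.
def rule30_step(cells):
    n = len(cells)
    # each cell's next state depends only on its local neighbourhood:
    # new = left XOR (centre OR right), with wraparound at the edges
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

width = 63
row = [0] * width
row[width // 2] = 1          # a single seeded cell

centre = []
for _ in range(200):
    row = rule30_step(row)
    centre.append(row[width // 2])

# the centre column looks random: roughly half ones, no obvious pattern
print(sum(centre) / len(centre))
```

The width and step count are arbitrary choices for the sketch; the point is only that local determinism is no barrier to generated randomness.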
Last year in a post to TransForum I noted:
But the real miracle is the basic physics of space time energy matter we have long taken for granted because it has been inescapable. Lately I have come to realise that the journey from nothing at all to expanding transparent momentum-conserving spacetime is a lot more tortuous, a lot more surprising, a much grander miracle than the journey from chemical elements to elephants and orcas.
Unfortunately the prevailing wisdom in and about the theoretical physics community is that a "theory of everything" is at hand which might allow the physical universe to be accurately modelled by a single equation or a single simple program. Given the inescapable observation that physical phenomena do not appear to vary with location or rotation, nor even with time, other than through the variation of measurable quantities consequent on the uneven distribution of matter and energy, the anticipation of an underlying unifying theory is unsurprising, at least until we face up to the historic fact that the closer we think we are getting to such a theory, the further away it turns out to be.
As an information technology professional of 40 years' standing, and notwithstanding the specific utility of Claude Shannon's information theory, I am underwhelmed by the enthusiasm of a generation seduced by the ubiquity of information technology into imagining that the multiverse itself, not just our models of it, might be an emergent property of some primitive information system. In this area I am happy to be old fashioned enough to think in terms of "dynamics" as defining what stuff does and "information" as defining what it might be possible to know about it. At a practical level I have no real expectation that even the best information theoretic primitive models can be implemented in foreseeable information systems with sufficient resolution to provide convincing models of observable phenomena. However they might only need to produce a few Galapagos finches to tell us what we need to know.
Earlier this year, I did some research into the simplest possible evolving network which meets my minimum criterion of generating a completely new graph each tick through purely local rules. While this model is clearly too simple to produce anything close to the computational universality that is a holy grail for complex systems researchers, it still produced its share of surprises, including persistence across generations of complex structures even while the component nodes and edges are generated afresh each tick. One of the assumptions of my work has been that an evolving network/graph needs to remain connected, in explicit contrast to the evolution from a pentatope seed into five persistent but disconnected structures, so I may not have yet quite come to terms with Cahill's idea of "gebit connectivity" with which:
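By way of illustration only, and emphatically a toy of my own rather than the model described above, one purely local rule that rebuilds a graph from scratch each tick is to replace it with its line graph: every new node is an old edge, and two new nodes are linked exactly when the old edges shared an endpoint. A triangle seed then shows the flavour of persistence across generations: the structure is regenerated identically even though no node or edge survives a tick.

```python
# Hedged toy, not the author's actual model: each tick replaces the whole
# graph with its line graph, so every node and edge is generated afresh
# by a purely local rule (a new link only "sees" the two old edges it joins).
def tick(edges):
    # new nodes are the old edges; new edges join old edges sharing a node
    nodes = sorted(edges)
    new_edges = set()
    for i, a in enumerate(nodes):
        for b in nodes[i + 1:]:
            if set(a) & set(b):      # locality test: shared old endpoint
                new_edges.add((a, b))
    return new_edges

# seed: a triangle on nodes 0, 1, 2
g = {(0, 1), (1, 2), (0, 2)}
for _ in range(3):
    g = tick(g)
print(len(g))  # the triangle persists: still 3 edges each generation
```

The line graph of a connected graph is itself connected, so this toy also happens to satisfy the connectedness criterion mentioned above.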
The set (of M nodes) will be partitioned into finite subsets of mutually disconnected components, each having Ni nodes which are at least simply connected - that is, each Ni may be described by a non-directed graph.
This seems to demand two levels of connectivity, the connectivity implicit through the M nodes all being subject to the same iterator being separate from the local Ni node connectivity graphs which might determine elemental function. Certainly Cahill and I would agree that irreducible, persistent "embedded topological defects" is a good candidate descriptor for elementary matter. For now at least, I have found it easier to deal with models in which "local" connectivity is explicitly maintained between all elements rather than admitting a possibility that they are all kept together in a (vast) bucket while local connectivity might come and go, even if this latter idea might fit more comfortably with the notion of the aether acting more like a liquid.
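Those two levels of connectivity can be made concrete with a small sketch (again my own illustration, with an arbitrary node count, random seed and link probability): all M nodes sit in one bucket acted on by the same iterator, while the link relation partitions them into mutually disconnected components, which is one possible reading of Cahill's gebits.

```python
# Hedged sketch of two-level connectivity: M nodes all live in one "bucket"
# (the iterator acts on every node), while sparse local links partition
# them into mutually disconnected components ("gebits" on one reading).
import random

random.seed(1)
M = 20
# sparse random local links: each possible pair linked with low probability
links = {(i, j) for i in range(M) for j in range(i + 1, M)
         if random.random() < 0.05}

def components(n, edges):
    """Partition nodes 0..n-1 into connected components via union-find."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for a, b in edges:
        parent[find(a)] = find(b)           # union the two components
    groups = {}
    for v in range(n):
        groups.setdefault(find(v), []).append(v)
    return list(groups.values())

gebits = components(M, links)
print(len(gebits), sorted(len(c) for c in gebits))
```

Every node lands in exactly one component, so the partition is total: bucket membership and link-connectivity really are independent notions.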
In the remainder of this page I am going to try to avoid getting bogged down with such specifics of candidate information theoretic micro scale models to instead focus on a couple of the questions still to be answered by the aether flow hypothesis, no matter how right it might feel. I will then take an all too brief look at some of the philosophical issues and especially try to separate the useful bits from some of the baggage that is increasingly fashionable in such arguments.
One possible problem for aether flow is to explain the scale of turbulence which is consistent with observation. In ordinary circumstances, turbulent flow contains eddies, but macro scale eddies in the gravitational aether flow would appear to imply localised pockets of anti-gravity. Maybe turbulence is only significant where the flow is weakest, at galactic scales, with laminar flow anywhere the flow is stronger, at the scale of our solar system. But that would turn on its head the familiar dynamics of turbulence in confined liquids where it is the slowest flows that remain laminar. And what is the timescale of any turbulent variations?
A second potential problem is to explain the coupling between the motion and acceleration of the aether and of matter therein. The velocity of radial aether flow towards a spherical mass is proportional to r⁻² at distance r from the centre of mass. For a steady flow the convective acceleration is v dv/dr ∝ r⁻² · r⁻³, so the acceleration of the aether flow is proportional to r⁻⁵! We know that the acceleration of matter in the same position is proportional to r⁻², so matter can only be loosely coupled to the aether flow. My expectation is that such loose coupling might correctly account for a few other observations such as Fresnel drag, but getting the numbers right may not be quite so simple.
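That scaling argument is easy to check numerically. The following sketch (my own illustration, not taken from Cahill's paper; the constant k is arbitrary) confirms that a steady flow with v ∝ r⁻² has convective acceleration falling off as r⁻⁵:

```python
# Numerical check of the scaling argument: for a steady radial in-flow
# v(r) = -k / r**2, the convective acceleration a = v dv/dr falls off
# as r**-5, far steeper than the r**-2 acceleration of matter.
k = 1.0

def v(r):
    return -k / r**2          # in-flow speed (negative: toward the mass)

def accel(r, h=1e-6):
    dvdr = (v(r + h) - v(r - h)) / (2 * h)   # central-difference derivative
    return v(r) * dvdr                        # steady flow: a = v dv/dr

# doubling r should divide |a| by 2**5 = 32 if a is proportional to r**-5
ratio = accel(1.0) / accel(2.0)
print(round(ratio))  # → 32
```

Analytically a = (-k/r²)(2k/r³) = -2k²/r⁵, so the factor of 32 between r = 1 and r = 2 is exact.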
It may be reasonable to propose that as well as those embedded topological defects which act as sinks for aether flow, i.e. ordinary matter, another set of embedded topological defects stabilised out of the initial chaos with the complementary property that they act as sources of aether flow. By producing aether, those sources would effectively move away from everything else and could thereby drive the Hubble expansion. While it might be very difficult to design an experiment to verify that the expansion is being primarily driven by aether flow out of intergalactic voids, it should be a lot easier to determine upper bounds for the rate of expansion at both galactic and solar system scales. Unlike the sinks, there would appear to be a need for an ever increasing supply of sources to push even basic cubic expansion, let alone any acceleration. My own interest in the veritable zoo of problems with the theory of gravity became more focused once I started to see the implausibility of the notion that the Hubble expansion was supposedly due to conservation of an initial impulse from the Big Bang.
From the perspective of History and Philosophy of Science, it is hard to conceive anything that might be more disruptive than for a consensus to arise that Einstein was mistaken in abandoning the notion of the aether. The Kuhnian notion of "paradigm shift" which dominates HPS contains an implicit notion that each such shift is a genuine step towards ultimate truth. How should one deal with a paradigm shift that proves to be mistaken, especially when the mistake was central to the standing of the iconic figure of 20th century science? What would that do for the reputation of science as an enterprise?
The implications for such unconnected fields as incorrigible religious attacks on the pillar of evolutionary theory don't bear thinking about. Society at large has never come to terms with the real significance of the scientific method of searching for truth, so if Einstein's reliance on a flawed interpretation of the Michelson-Morley results entered the public consciousness, the reputation of science as a whole would be greatly diminished. The baby would be thrown out with the bath water. Yet if it is to be true to itself, science has no choice but to remain open to self criticism and the wider criticism that must follow.
At the heart of this there is a connection to misperceptions about mathematical models. Public discourse overwhelmingly fails to deal with ideas above a certain level of complexity, so there is a general belief, which even pervades the scientific community, that if one model is accurate then all other models must ipso facto be wrong. While it may be obvious enough when you sit down to think about it that any number of very different models may concur with observation and experiment to the limits of accuracy, that fact may never be obvious to the body politic and is unlikely to even guide the practice of scientists. This very human need for one truth may be more a topic for cognitive science and thus beyond the intended scope of this commentary, so I will leave it there for now.
Planets and minds are powerful devices for turning the perpetual now towards a time dimension extrapolated into past and future, to the point of making almost credible a metaphysical claim that past, present and future are no more than aspects of a time dimension, and thus the fanciful notion of existence outside time from which past and future might be viewed in full fidelity. Process Philosophy, correctly in my opinion, throws a large bucket of cold water on that fantasy, but in doing so has become allied with a plethora of other speculations which might be loosely grouped as "New Age" if some of them weren't quite ancient, at least in human cultural terms, and none of which are all that helpful, especially when taken literally.
The idea at the heart of Wolfram's work is that much more of the world can be modelled by processes which determine the next local state via a "simple program" operating on the previous state of the "local" neighbourhood than can ever be modelled by traditional mathematics. Of course the oft-explored implication that the world might actually be the product of such a program running on a universal computer often clouds the issue, but that step too far is not directly relevant to Cahill's paper so we do not need to do more here than to remember to forget it. Rather, it does seem that at least some of Wolfram's identified categories of "simple program" are as likely to provide candidate Skyrmions as Cahill's "gebits", assuming, as I am happy to, that Skyrme's notion of persistent (irreducible) topological defects (cf. knots) is what is needed at the micro level.
So, we might take from Cahill's work that aether flow can potentially explain gravity consistently with some yet-to-be determined microstructure that provides sinks for the aether flow which are consistent with the dynamics of matter, to which I would add sources of aether flow ever deeper in the void. We may also take it that the only thing which truly exists is the ever changing now, albeit a now of which any observer can only truly experience a particular local view from which they might cleverly model patterns across time. However, one price of the success of Cahill's claims would be a severe diminution of the public and political status of science. Optimistically, I would like to think that we can build on the work of Chaitin on the randomness of truth and its relationship to the dynamics of human conversation before the Cassini probe starts returning irrefutable evidence of spherical asymmetry in gravitation around Saturn which cannot be accounted for by General Relativity, unless of course the uncertainties about gravity continue to cancel so neatly when it comes to putting space probes exactly where we want them.
 Cahill has pointed out that his notion of "semantic information" is very different to the notion of "information" to which I am referring here. I have no trouble with the idea that what he is talking about is information that is understood by the iterating process and not by humans.
 Cahill has expressed concern over this point, providing me with another paper to read which I am still digesting. So I am leaving it in unaltered, albeit with this disclaimer, at least for now.