Ellis’s recent critique of varying speed of light theories has caused a flurry of blog posts. A summary of his five main points, in his own words, runs as follows.

Point 1: Any VSL theory involving variable speed of photon travel must of necessity be based on some other method of measuring spatial distances than radar.

Point 2: Any VSL theory based on changes in the metric tensor components must explain how it differs from GR and how time and space measurements are related to the metric tensor.

Point 3: Any VSL theory involving a change in the limiting speed will not be Lorentz invariant; the way Lorentz invariance is broken must be made explicit.

Point 4: Any VSL theory involving a change in the speed of the photon travel must eventually propose some other equations than standard Maxwell’s equations to govern electromagnetism.

Point 5: Any VSL theory must be done consistently in terms of its effects on the whole set of physical equations.

The critique takes a glaringly classical view of the cosmos, but one can hardly argue with the last four points. As far as popular VSL theories are concerned, I mostly agree with the critique. The first point, however, makes no sense at all! It is of course true that the definition of the metre fixes the speed of light locally, but this does not imply that we should compare measurements over cosmological distances using the same speed of light. How do we compare measurements over cosmological distances at all? The only work I am aware of which considers real measurements over large scales is Louise Riofrio’s analysis of the speed of light on an older Earth. In fact, sensible approaches to quantum gravity take the measurement of distance via photon travel far more seriously than classical gravity does. Connes has expressed this nicely in his motivation for a new spectral physics. A varying speed of light is introduced by Riofrio as a useful picture for what we observe in the cosmos from here. Ellis can hardly argue with the observed expansion of space, or rather the observed increase in distance between stationary objects over cosmic time, which is based very much upon the spectral concept of distance.

Ellis mentions Penrose’s approach to causality, so he probably won’t argue with anybody who wants to replace Maxwell’s equations immediately with sheaf cohomology in twistor theory. As Penrose himself says, twistor theory seems to have a bearing on quantum gravity, rather than general relativity itself. But it is well known that mass generation is a difficult question in twistor theory, although Hughston and Hurd made interesting progress in the late 1980s by combining two massless solutions to obtain an $H^2$ massive state. This work only stresses the importance of understanding higher dimensional non-Abelian cohomology, and in this framework a varying speed of light is really a minor concern.

The wording of Ellis’s points betrays some further prejudices about the mathematics being used in VSL investigations. For example, the whole set of physical equations is far too restrictive a notion for a category theorist. Causality takes us beyond the realm of mere set theory, as we have seen.

## CarlBrannen said,

June 1, 2007 @ 5:45 am

I don’t agree that the first point is senseless. He ameliorates his question by adding: “So the question for any specific proposed VSL theory is, what viable alternative proposal for distance measurement is made?”

I wonder how the geometric algebra people over at Cambridge would answer this first point.

As you are aware, I’m convinced that I have a VSL theory where all known particles (and photons) are made up of preons that travel at a fixed speed that is approximately $c\sqrt{3}$ in a typical astronomical reference frame (e.g. the sun, the earth, or the Milky Way).

So I can use the preons to define distance. Difficulties in observing them are an experimental issue, not a theoretical problem.

I agree that the rest of his points have to be addressed, and that’s why I’ve been working fairly hard (for me) on Painlevé metric stuff. I’ve got the 4th-order Runge-Kutta numerical integration running, and it is on the web at GravitySimulation.com
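For readers unfamiliar with it, the classical 4th-order Runge-Kutta scheme mentioned above can be sketched in a few lines. This is only a minimal illustration, not the code at GravitySimulation.com, and the function names are mine:

```python
def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, t1, n):
    """Integrate dy/dt = f(t, y) from t0 to t1 in n RK4 steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = rk4_step(f, t, y, h)
        t += h
    return y
```

For a geodesic integrator, `y` would be the state vector of coordinates and velocities, and `f` the right-hand side obtained from the metric’s Christoffel symbols.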

One point that is missing, interestingly, is the question of what happens to the equivalence principle. I’m convinced that there is a maximum possible gravitational (i.e. not cosmological) redshift.

## CarlBrannen said,

June 1, 2007 @ 6:34 am

Well I went over and defaced Motl’s comment section by adding the (soon to be deleted) comment:

“Important physics has never worked like that. Quite on the contrary. Every new major revolution in physics has shown that a certain conversion parameter was not only constant but it was meaningful to set it equal to one.”

Ah, strong induction, surely an important foundation stone of string theory.

But this sort of reasoning is considered garbage in mathematics.

## Matti Pitkanen said,

June 1, 2007 @ 7:38 am

I find nothing wrong in the idea of variable c as long as one specifies precisely how the measurement of the speed of light is defined.

I mentioned already earlier that in many-sheeted space-time one indeed obtains different propagation times for photons travelling from point A to point B along different space-time sheets, because the distance in the induced metric differs from sheet to sheet. This picture is consistent with the metric view of gravitation.

I agree with Carl about how dangerous it is to put $\hbar = 1$ ;-).

## Matti Pitkanen said,

June 1, 2007 @ 7:53 am

Kea said:

“The wording of Ellis’s points betrays some further prejudices about the mathematics being used in VSL investigations. For example, the whole set of physical equations is far too restrictive a notion for a category theorist. Causality takes us beyond the realm of mere set theory, as we have seen.”

I almost agree. If one, however, takes light-like 3-surfaces as fundamental objects, one obtains an almost topological QFT, with the S-matrix given by timelike entanglement coefficients defining a functor from the category of Feynman cobordisms (see my blog) to the category of operators between positive and negative energy Hilbert spaces. Lightlikeness brings in the notion of length measurement and makes the theory physically interesting.

The S-matrix need not be unitary, and unless one has an HFF of type $II_1$, the functor property yields a thermal S-matrix. Center-of-mass degrees of freedom unavoidably bring in a factor of type I, so that one cannot avoid thermodynamics for normalizable zero energy states.

This is a very nice result, since thermodynamics becomes part of quantum theory rather than a mere practical fiction of the theorist. The possibility of assigning a temperature to black holes and the p-adic mass calculations already suggest this, as do the findings about states for hyper-finite factors of type III.

S-matrices and quantum states are parametrized by a complex number whose real part has an interpretation as the duration of the experiment and whose imaginary part as an inverse temperature. By the functor property, S-matrices, and thus also quantum states, allow a product decomposition analogous to group multiplication.

## Kea said,

June 1, 2007 @ 9:31 pm

Hi Carl and Matti. The reason I don’t like the idea of ‘distance via preons’ (although I’m not arguing with your physics) is that I prefer to stick to actual observables wherever possible, such as photon frequency. Anyway, presumably Ellis was far from thinking about preons, and I think his point 1 has problems addressing, for instance, Connes’ point of view.

## Confused said,

June 5, 2007 @ 8:06 am

Dear Kea,

I personally don’t really see the big deal about VSL. Even in general relativity, since one is dealing with Lorentzian manifolds, it seems pretty clear to me that the coordinate speed of light should vary from point to point with variations in the metric. (I think I agree with Matti here.)

It is only in flat (3,1) space, i.e. Minkowski space, in the complete absence of matter, that the speed of light should be constant (or in regions where the stress-energy tensor is close to negligible, like the solar system in which we live). Or am I being just hopelessly naive?
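A standard illustration of this point: for a radial null ray in the Schwarzschild metric, setting $ds^2 = 0$ gives

$$ -\left(1 - \frac{2GM}{rc^2}\right)c^2\,dt^2 + \left(1 - \frac{2GM}{rc^2}\right)^{-1}dr^2 = 0 \quad\Longrightarrow\quad \left|\frac{dr}{dt}\right| = c\left(1 - \frac{2GM}{rc^2}\right), $$

so the coordinate speed of light tends to $c$ far from the mass and falls to zero at the horizon, even though every local observer measures exactly $c$.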