I found a type of applied mathematics which is an evolution from Dynamic Noncooperative Game Theory.

Idempotents within semirings are discussed.

An operator that may resolve the UV problem is also used.

I am not familiar with preons and have some reading to do.

In the TGD standard model, quantum numbers are explained in terms of symmetries of the imbedding space, so that something totally new is in question. Higgs is also present but couples weakly to the fermions, since p-adic thermodynamics gives the dominant contribution to fermion masses. In a very optimistic mood one could take the latest indications of inconsistent decay rates of Higgs to b- and tau-pairs as evidence for the TGD picture of Higgs (see this and this).

Number theoretical braids give rise to additional number theoretic degrees of freedom including Galois groups acting as symmetries permuting strands of the braids. The unitary braiding matrix in principle makes possible topological quantum computation like processes for large partonic 2-surfaces and perhaps even at elementary particle level so that elementary particles might be much more than we have been accustomed to think.

One fascinating prediction is the replication of braids in the parton decay vertex, which would correspond to a faithful copying of classical information. Parton exchanges would correspond to its communication. Since partonic surfaces appear in all length scales, there is a strong temptation to think that DNA replication is basically braid replication and that DNA acts as a topological quantum computer. One of the TGD based models for the genetic code conforms with this interpretation.

One could even consider the possibility that the Universe acts as a topological quantum computer in all scales. In a quantum theory based on hyper-finite factors of type II_1 and a hierarchy of quantized Planck constants explaining dark matter and dark energy, this is a more or less unavoidable prediction.

Best Regards, Matti

Because these are joined at the top, Dr Motl, in his angry Amazon review of Dr Smolin’s latest book, dismissed this claim that LQG accounts for the Standard Model as merely “octopusses swimming in the [Penrose] spin network”.

Of course an octopus has eight legs, and these only have three. But you don’t seriously expect a string theorist like Dr Motl to get simple numbers correct. (After all, these people claim there are 10/11 dimensions.)

The problem with all abstract *ad hoc* models is that they are not really telling you anything unless you can get predictions from them. It reminds me of epicycles, caloric, phlogiston, the mechanical gearbox-type aether, and the vortex atom.

These junk mainstream efforts were successful models in a small (but over-hyped) way for a limited range of phenomena, but they ended up hindering the progress of science.

It’s maybe a difficulty that people are trying to build speculative, abstract models that don’t have physical explanations for real, experimentally validated phenomena:

‘It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of spacetime is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities.’ – R. P. Feynman, Character of Physical Law, November 1964 Cornell Lectures, broadcast and published in 1965 by BBC, pp. 57-8.

I think that some aspects of the Higgs mechanism are probably vital.

The colour force does seem to arise when you confine two or three charges, basically leptons. The colour force arises because of the interaction between the confined charges, and the attenuation of electromagnetic charge energy by the polarized vacuum.

But the idea is that at high energy all forces are closely related but some gauge bosons acquire mass and get shielded by the vacuum somehow, limiting their range.

If you think about it, the electromagnetic charge is being attenuated all around a particle, out to the IR cutoff distance of ~1 fm, by the polarization of the pairs produced in the intense electric field (>10^18 V/m) that extends to that distance.
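
As a rough sanity check of that field-strength figure, a short sketch (purely the classical Coulomb field, ignoring all vacuum corrections) shows the electron's field at 1 fm does exceed 10^18 V/m:

```python
# Rough check of the claim above: the electric field of an electron at a
# distance of ~1 fm (classical Coulomb field only; for comparison, the
# Schwinger pair-production threshold is of order 1.3e18 V/m).
k = 8.9875517923e9      # Coulomb constant, N m^2 / C^2
e = 1.602176634e-19     # elementary charge, C
r = 1.0e-15             # 1 femtometre, m

E = k * e / r**2        # Coulomb field strength, V/m
print(f"E at 1 fm: {E:.2e} V/m")   # ~1.4e21 V/m, well above 1e18
```

So the quoted >10^18 V/m is comfortably satisfied; the Coulomb field only falls to the ~10^18 V/m level at tens of femtometres.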

*What happens to the energy that is absorbed from the electromagnetic field by polarization caused shielding?*

Clearly, the vacuum-attenuated electromagnetic field energy of the particle is used to produce high-energy loops, and the loops polarize. It’s a physically real process.

Now consider that somehow you could fire 3 electrons very close to one another. The overall electric field is then 3 times stronger, so the polarization of charge pair production in the vacuum is 3 times greater; hence the electromagnetic force per charge is shielded 3 times more strongly at a distance, so that each electron has an apparent electric charge – as seen outside the IR cutoff – of exactly -1/3.

This is the charge of the downquark.
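
The arithmetic of that argument can be sketched in a few lines. This is a toy illustration of the speculative shielding idea above, not established physics; the assumption coded in is that the vacuum shielding factor scales linearly with the number of confined charges:

```python
# Toy sketch of the vacuum-shielding argument above. Assumption (from the
# comment, not established physics): shielding grows n-fold when n
# identical charges are confined together, since their fields add.

def apparent_charge_per_particle(n_confined, bare_charge=-1.0):
    """Apparent charge of each of n closely-confined identical charges,
    as seen outside the IR cutoff, under the linear-shielding assumption.
    Normalized so that a single electron shows charge -1."""
    shielding = float(n_confined)
    return bare_charge / shielding

print(apparent_charge_per_particle(1))  # -1.0    (isolated lepton)
print(apparent_charge_per_particle(3))  # -0.333… (downquark-like -1/3)
```

The design choice here is just to normalize the single-electron case to -1, so the triad case yields -1/3 directly.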

Naturally, the full details are complex because of the Pauli exclusion principle, the weak hypercharge, and strong forces. Because of the complexity, you will get +2/3 for the upquark, etc.

However, it’s clear that if this basic physical idea is a helpful clue, then the difference between a lepton and a quark is produced by vacuum field effects arising from the proximity of leptons. The electromagnetic energy that is shielded – i.e., 2/3 of the energy when an electron effectively becomes a downquark – is used to create the strong or colour charge, which is completely absent when the lepton is not confined in a pair or triad.

Woit has shown how representation theory may unify all Standard Model forces: electromagnetic, weak and strong. See p. 51 of his paper, http://arxiv.org/abs/hep-th/0206135.

Unification implies a relationship between all types of force, and thus all types of charge. If there is any unified theory of fields, then it will explain how forces are different aspects of the same thing.

The Standard Model is just experiment-based symmetry groups. It’s not a theory by itself. It can’t even unify electromagnetism and the weak force into the electroweak force without the Higgs sector which is speculative as to the number and type of “Higgs bosons”.

The problem with Feynman’s idea of great simplicity in the physical mechanism is that the apparent complexity of the Standard Model and general relativity could then only arise from a kind of Rube Goldberg universe: the underlying simplicity is covered up by numerous simple mechanisms working together to create the apparent complexity of nature.

I’ve been reading Carl’s papers and they are very interesting.

One thing I don’t understand is the attention given to the formula

[(e^n + m^n + t^n)^(1/n)]/(e + m + t) = 3/2 if n = 1/2,

where e, m and t are the masses of the electron, muon and tauon.

At n = 1/2 the result is 3/2; at n = 1 the ratio is obviously 1.

I don’t see what physical significance this has. It looks like numerology, where the reader is impressed that this way of averaging the masses gives the result 3/2. With other values of n, you get different dimensionless results. What is special about 3/2?
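
For what it's worth, the formula is easy to check numerically. A short sketch, using approximate charged-lepton masses in MeV (the values are rounded, so the ratio only holds to a few decimal places here):

```python
# Numerical check of the formula quoted above (the Koide relation),
# using approximate charged-lepton masses in MeV.
m_e, m_mu, m_tau = 0.5110, 105.6584, 1776.86

def ratio(n):
    """[(e^n + m^n + t^n)^(1/n)] / (e + m + t)"""
    return (m_e**n + m_mu**n + m_tau**n)**(1.0/n) / (m_e + m_mu + m_tau)

print(ratio(0.5))  # ~1.5000: the n = 1/2 case singled out in the text
print(ratio(1.0))  # exactly 1 by construction
```

With measured masses the n = 1/2 ratio does come out at 3/2 to roughly five significant figures, which is why the case n = 1/2 gets the attention; for other n the ratio is nothing special.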

I know it comes into the empirical relationship between electron mass, muon mass, and alpha:

muon mass ~ electron mass × (3/2)/alpha

= 0.511 MeV × (3/2) × 137.036…

= 105.0 MeV,

but this is only approximate, since the muon mass is about 105.66 MeV.

Hence, the 3/2 factor there should be replaced by about 1.51. On the other hand, it’s clear that whatever the final theory is, there should be some clues in data about masses of particles and so on, just as the periodic table was assembled empirically before being explained theoretically.
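
The arithmetic behind the 105.0 MeV estimate and the ~1.51 correction factor is a one-liner each; a quick sketch using the rounded values quoted above:

```python
# The approximate empirical relation quoted above:
#   muon mass ~ electron mass * (3/2) / alpha
m_e = 0.511            # electron mass, MeV
inv_alpha = 137.036    # 1/alpha
m_mu_obs = 105.658     # observed muon mass, MeV

m_mu_est = m_e * 1.5 * inv_alpha          # the (3/2)/alpha estimate
factor = m_mu_obs / (m_e * inv_alpha)     # exact factor replacing 3/2

print(f"estimate: {m_mu_est:.1f} MeV")    # ~105.0 MeV
print(f"required factor: {factor:.3f}")   # ~1.509, not exactly 3/2
```

So the relation is off by about 0.6%, which is the point being made: 3/2 must be replaced by roughly 1.51 to match the observed muon mass.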

These ideas are very interesting and should be published on arXiv.
