Title: Conceptual framework for a novel non-local hidden-variable theory of physics: Cordus theory
17 Sept 2015, 15h00, venue Ers446
Content: As per http://vixra.org/abs/1104.0015
Most things in physics are symmetrical. This is evident in action-reaction forces and the conservation of momentum and energy. Particle interactions are generally also symmetrical. At least the Standard Model of particles predicts that particle interactions are symmetrical when charge, parity and time (hence CPT) transformations are all applied.
Which makes our latest findings all the more curious. In ‘Asymmetrical neutrino induced decay of nucleons’ (http://dx.doi.org/10.5539/apr.v7n2p1) we predict that the neutrino and antineutrino behave differently in their interactions with the proton and neutron. There are two parts to this prediction. First, we predict that the neutrino and antineutrino (the neutrino species) can cause (induce) decay. The conventional interpretation is that they are merely outcomes of the decay process, and are not involved on the input side at all. Second, we predict that the neutrino species induce decay differently in the proton and neutron (the nucleons), hence asymmetrical decay. In contrast, the conventional interpretation is that nucleon decay is purely random, that the mean lifetimes are constant, and that decay is not affected by the external environment.
This prediction is made on theoretical grounds by a logical extension of the Cordus theory, specifically by using its mechanics for discrete fields. This predicted asymmetry is novel and unorthodox. There is nothing in physics that disallows such an asymmetry, but neither is there any reason to expect it. We were therefore surprised that this asymmetry emerged. It was not something that we were actively seeking, but rather it was a supplementary exploration while we were searching for answers to the asymmetrical genesis problem. We have addressed the genesis situation elsewhere, and can explain how the universe came to be made up of more matter than antimatter (CP violation). The present paper takes a similar approach, in that it uses the same Cordus mechanics, but the starting point is different.
The problem is that the operation of neutrino detectors shows that nuclide decay rates can be affected by loading of neutrino species. However the underlying principles of this are poorly understood. The purpose of this paper was to develop a conceptual solution for the neutrino-species interactions with single nucleon decay processes. Single nucleons, i.e. a single proton or neutron, are a simplification of the more complex situation inside the nucleus of large atoms. The starting point was the non-local hidden-variable (NLHV) solution provided by the Cordus theory, specifically its mechanics for the manipulation of discrete forces and the remanufacture of particle identities. This mechanics was applied to the inverse beta decays and electron capture processes for nucleons. These are situations where the neutrino or antineutrino is supplied as an input, as opposed to being an output as in the conventional decays.
Our findings are that inverse decays are predicted to be differentially susceptible to inducement by neutrino species. The inverse β- neutron decay is predicted to be susceptible to neutrino inducement (but not to the antineutrino). Correspondingly, β+ proton decay is predicted to be induced by input of energetic antineutrinos, but not neutrinos. Hence a species asymmetry is predicted. The inverse electron capture (EC) is predicted to be induced by pre-supply of either a neutrino or an antineutrino, with different energy threshold requirements in each situation. The neutrino-induced channel is predicted to have a greater energy barrier than the antineutrino channel.
We also have a third prediction. This is that one unified decay equation can be written to express β- neutron decay, β+ proton decay, and electron capture. Furthermore, that this equation applies to the conventional forward decays and the induced decays proposed here. The originality here is the proposed new methodology for predicting the outcomes of decays and particle transformations. If valid, this provides a simplification in the representation of the decay processes. An interesting little rule in the unified decay equation is that transfers across the equality result in inversion of the matter-antimatter species (hand).
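As a reference point, the three conventional forward decays that such a unified equation would need to cover can be written in standard notation (the induced variants proposed in the paper additionally place a neutrino species on the input side; the unified Cordus equation itself is in the paper, not reproduced here):

```latex
% Conventional forward decays, standard notation
\begin{align}
  \beta^- \text{ (neutron) decay:} \quad & n \to p + e^- + \bar{\nu}_e \\
  \beta^+ \text{ (proton) decay:}  \quad & p \to n + e^+ + \nu_e \\
  \text{Electron capture (EC):}    \quad & p + e^- \to n + \nu_e
\end{align}
```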
What would be the implications if all this was valid? Well, the theory predicts the existence of a number of induced decays with asymmetrical susceptibility to neutrino-species. The results imply that detectors that measure β- outcomes are measuring neutrinos, and β+ antineutrinos. A novel prediction is made, that neutrino-species induce decay of nucleons, and that the interaction is asymmetrical. Hence also, that different decay types are affected differently by the input of energy and neutrino-species. A detailed explanation is provided of how this occurs at the level of the internal structures of the particules.
This is an unorthodox theory and an unexpected set of predictions. Whether or not this theory is valid we do not yet know, but it does make specific predictions that no other theory makes, which is interesting. None of these are contemplated by conventional theories of quantum mechanics, the standard model, or supersymmetry. This might seem a weakness, but is actually a good position to be in for concept development. If a theory can predict something specific that other theories cannot, then that differentiates the theory. In this case the predictions are also testable. Consequently this gives a way for the future to show whether or not the theory is valid (falsifiability). For the moment we simply state that it is a logical extension of the Cordus theory, and the outcomes are curious enough to be worth reporting.
Fundamental physics and cosmology intersect at genesis. Some really big and interesting problems arise here. One of these is to explain the production of matter at the initial genesis event (big bang). There are several sub-problems. These include a need to describe how mass-energy equivalence occurs at the fundamental level, i.e. how energy transforms into mass (E=mc^2). This is particularly important for pair-production, e.g. the conversion of photon energy into an electron and anti-electron (positron). It is also necessary to understand how the inverse process of annihilation occurs. Related to that, and an especially difficult problem, is to explain why more matter (electrons, protons) exists than antimatter (positrons, anti-protons). This is the asymmetrical genesis problem, and it has two parts: asymmetrical leptogenesis, and asymmetrical baryogenesis, for the electrons and protons respectively.
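The mass-energy bookkeeping of pair production is at least easy to state, even if the mechanism is not: a photon must carry at least two electron rest masses of energy before it can convert into an electron-positron pair. A quick sketch of that threshold, using standard CODATA constants and plain E = mc^2 arithmetic (nothing specific to the Cordus theory):

```python
# Threshold photon energy for pair production: E >= 2 * m_e * c^2
m_e = 9.1093837015e-31    # electron rest mass, kg (CODATA)
c = 2.99792458e8          # speed of light, m/s
eV = 1.602176634e-19      # joules per electron-volt

E_threshold_J = 2 * m_e * c**2            # energy of two electron rest masses
E_threshold_MeV = E_threshold_J / eV / 1e6

print(f"Pair-production threshold: {E_threshold_MeV:.3f} MeV")  # ~1.022 MeV
```

Any photon energy above this threshold appears as kinetic energy of the pair, which is why pair production is observed for gamma rays but never for visible light.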
However the problems don’t stop there. An explanation is also needed for how the neutron is made. This is much the easiest part, since the beta decays and electron capture processes are readily observable, unlike the other parts of the process. So conventional physics already provides theories in this area.
The next step is to explain how the atomic nucleus functions, i.e. how protons and neutrons are bound together. This is much more difficult. The strong force is thought to be the mechanism for this, but its workings at this level have not been solved. Also, the Universe is not made solely of hydrogen, but instead there are many elements, and each has many isotopes, with different lifetimes. All these nuclides need explaining too. Other problems to explain are electron orbitals (quantum theory is pretty advanced in this area), and the inflation process.
In summary, the genesis problem is to explain how energy was converted into the diverse forms of matter that we observe in our Universe.
We apply a production and systems engineering method to this problem. To our way of thinking there exists a GENESIS PRODUCTION SEQUENCE (GPS). We seek to determine what kind of processes could be involved. Our approach is a systems-engineering one based on the premise of physical realism: we take the attributes of the observable universe, and from those infer the necessary functionality of deeper processes (sub-systems). Where necessary we use design thinking to creatively anticipate the mechanisms that support those deeper systems. We require a logical continuity of explanation throughout the solution that emerges, and constantly test and adjust the theory to achieve this. This provides coherence. We then logically extend the theory to other phenomena, and explore whether it is able to give solutions to those new areas. In this way we test the external construct validity, and further change or extend the theory as appropriate. We have been doing this for several years, and have steadily advanced our coverage of fundamental physics and cosmology. The result is the Cordus theory.
Now the pieces of the genesis production sequence are starting to come together.
The Cordus theory now provides solutions for much of the genesis production sequence. We now understand the processes of mass-energy equivalence, at least at a conceptual level, and this includes both pair-production and annihilation. The asymmetrical genesis problem also has a solution, for both leptogenesis and baryogenesis. [See previous post]. We can also explain the processes for beta decay, and the stability attributes of the nuclides (H to Ne).
The diagram below summarises the production processes from photons to the electron, proton and neutron. Hence the origin of all the basic building blocks of the matter Universe can be explained. The coloured objects in the large central blocks are the inferred internal structures of the various particles.
This is an interesting development for two reasons.
First, it is surprising that a theory based on a non-local hidden-variable design with discrete fields should have this degree of explanatory power. Physics had otherwise given up on NLHV designs. Many have attempted to mathematically disprove even the possibility of their existence, as per the Bell-type inequalities. However the NLHV designs have never been fully disproved on theoretical grounds, and now a design has emerged that shows how powerful they can be in an explanatory sense. Fielding a workable solution, in the form of the Cordus theory, means that the Bell-type inequalities are falsified. This is surprising to many, to the point of disbelief.
Second, it is interesting that a candidate theory now exists for much of the genesis production sequence. Other theories of physics, such as quantum mechanics, relativity, and string theory, have solutions for pieces of this. But their explanations do not approach the coherence and breadth of this one. The new theory starts to show the limitations of the old theories, and starts to subsume them. Consequently the implication is that there is a new physics provided in this internal variable theory. This is difficult for orthodox physicists to accept. We see this disbelief in reviewers’ comments. They say it is impossible that quantum mechanics is not the solution. Physicists have so much intellectual investment in quantum theory (in particular) that they cannot but persist with QM. In systems engineering we call that a sunk-cost bias.
You can read more about the genesis solution in the papers listed below.
1. Pons, D. J. and Pons, A. D., Outer boundary of the expanding cosmos: Discrete fields and implications for the holographic principle. The Open Astronomy Journal, 2013. 6: p. 77-89. DOI: http://dx.doi.org/10.2174/1874381101306010077.
2. Pons, D. J., Pons, A. D., and Pons, A. J., Time: An emergent property of matter. Applied Physics Research, 2013. 5(6): p. 23-47. DOI: http://dx.doi.org/10.5539/apr.v5n6p23.
3. Pons, D. J., Pons, A. D., and Pons, A. J., Beta decays and the inner structures of the neutrino in a NLHV design. Applied Physics Research, 2014. 6(3): p. 50-63. DOI: http://dx.doi.org/10.5539/apr.v6n3p50.
4. Pons, D. J., Pons, A. D., and Pons, A. J., Explanation of the Table of Nuclides: Qualitative nuclear mechanics from a NLHV design. Applied Physics Research, 2013. 5(6): p. 145-174. DOI: http://dx.doi.org/10.5539/apr.v5n6p145.
5. Pons, D. J., Pons, A. D., and Pons, A. J., Synchronous interlocking of discrete forces: Strong force reconceptualised in a NLHV solution. Applied Physics Research, 2013. 5(5): p. 107-126. DOI: http://dx.doi.org/10.5539/apr.v5n5107.
6. Pons, D. J., Pons, A. D., and Pons, A. J., Differentiation of Matter and Antimatter by Hand: Internal and External Structures of the Electron and Antielectron. Physics Essays, 2014. 27: p. 26-35. Available from: http://vixra.org/abs/1305.0157.
7. Pons, D. J., Pons, A. D., and Pons, A. J., Annihilation mechanisms. Applied Physics Research, 2014. 6(2): p. 28-46. DOI: http://dx.doi.org/10.5539/apr.v6n2p28.
8. Pons, D. J., Pons, A. D., and Pons, A. J., Asymmetrical genesis by remanufacture of antielectrons. Journal of Modern Physics, 2014. 5: p. 1980-1994. DOI: http://dx.doi.org/10.4236/jmp.2014.517193.
9. Pons, D. J., Pons, A. D., and Pons, A. J., Weak interaction and the mechanisms for neutron stability and decay. Applied Physics Research (in press).
10. Pons, D. J., Pons, A. D., Pons, A. M., and Pons, A. J., Wave-particle duality: A conceptual solution from the cordus conjecture. Physics Essays, 2012. 25(1): p. 132-140. DOI: http://physicsessays.org/doi/abs/10.4006/0836-1398-25.1.132.
Venture to where the wild things are?
Indeed the behaviour of particles is peculiar. They behave as waves or points, depending on how one interrogates them. The article follows that path and then shows how usual explanations about contextual measurement lead to absurdities: the role of the conscious observer in collapsing quantum indeterminacy is an unsatisfactory model.
The article then relentlessly pursues a reductive explanation for physics, and shows that pure mathematics would be the underlying reality if that path is taken. (I am not sure that I followed or agreed with all the logic though.) Hence likewise back to mental entities and an observer.
I loved these circular concepts, for their philosophical wrangling and the stark conclusion that philosophy does not give an answer either way. However, perhaps those are not the only two options? Could there be others? There was one candidate whose absence I did not regret: the usual escape hatch by which physics evades the effort of thinking about meaning, namely the many-worlds theory. What a relief not to have that given an airing!
There may be other candidates, more deserving of consideration. The article might have gone on to discuss locality and local realism, hence also entanglement, and how solutions might emerge from that direction.
In particular, a logical case may be made that weird interpretations result, not as the article implies from the choice of solution path taken, but because there is something deeply wrong with one of the fundamental tacit premises.
We note that the entire weight of the ontology of QM’s physics rests on a zero-dimensional point. It is thus hardly surprising that QM gives us weird explanations, singularities, and circular reasoning. What is even more surprising is the persistent adherence to the ontology, instead of a serious questioning of whether such premises might be wrong. But if not a point, then what? And don’t the Bell-type inequalities preclude anything but a 0-D point at the root of all matter?
They would seem to, though there is reason to believe that those inequalities might be based on circular reasoning of their own. They start by implicitly assuming a 0-D point structure and then conclude that matter can have no internal structure. Duh! Is that not circular?
If so then that opens the possibilities for other concepts for matter. String theory is one, though it yields only mathematical solutions rather than physical interpretations, and therefore seems closer to the reductionist line of thinking. But there are other non-local hidden-variable solutions, and some of these already provide physically natural explanations, devoid of metaphysical weirdness, for all the paradoxes here mentioned and more besides: Airy patterns, wave-particle duality, entanglement interferometers, among others.
As expected, there is a cost, which is that the 0-D point construct would have to be abandoned. And locality too. Some would say these are light costs, for gaining so much more explanatory power than extant theories.
My point is therefore that the article only explores two of the possible solutions to the reality question, and finds answers in neither of them. My criticism then is that the article stays too close to the campfire of the safe orthodoxy, which we already knew is beyond weird. What will physics have? Stay with the orthodoxy and accept the weirdness and epistemic stasis? Or venture to where the wild things are, and explore the raw new ideas?
Our own work in the wild side is readily available at http://vixra.org/
- Jan Westerhoff (2012) Reality: Is matter real? New Scientist
The strong force is that which binds the protons and neutrons inside the nucleus of the atom. The protons, which each have a unit positive charge, would otherwise repel each other. Thus the bonding force has to be ‘strong’ enough to overcome the electrostatic force.
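To put a number on ‘strong enough’: a back-of-envelope Coulomb calculation shows the electrostatic repulsion the binding force must overcome between two protons at nuclear range (the ~1 fm separation is an assumed, illustrative figure):

```python
# Coulomb repulsion between two protons at roughly nuclear separation
k = 8.9875517873681764e9   # Coulomb constant, N m^2 / C^2
e = 1.602176634e-19        # proton charge, C
r = 1.0e-15                # assumed separation ~1 femtometre, m

F = k * e**2 / r**2        # repulsive force at that range, N
U = k * e**2 / r           # electrostatic potential energy, J
U_MeV = U / 1.602176634e-19 / 1e6

print(f"Force: {F:.0f} N, barrier: {U_MeV:.2f} MeV")
```

Roughly 230 N acting on a single subatomic particle, and an energy barrier of order 1 MeV: whatever the strong interaction is, it must exceed this at short range yet vanish at atomic distances.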
The strong force also binds the quarks together inside the proton (or neutron). The binding mechanism at this level is currently represented by quantum chromodynamics (QCD). It proposes that the quarks exchange gluon particles of three different types. These are conveniently called ‘colour’ (hence also ‘chromo’), but that should be understood as representative rather than literal. QCD is very successful, though it does have limitations. A niggly philosophical one, some might say inconsequential, is that it introduces a new ‘colour force’, which thus also needs an explanation.
It should eventually be possible to model the structure of atomic nuclei, i.e. the elements and the nuclides, from the ground up using the strong force. However that bigger problem is still unsolved, and this is a more obvious limitation of QCD. It is a big problem, because it means that we cannot yet make the connections between physics and chemistry at the fundamental level. There simply has been no traction on solving this. Which of course is strange, since the purpose of the strong force in the first place is to bind the nucleus together. So a theory that describes the one should solve the other too.
Which raises the possibility that the existing models for the strong force may be deficient. That’s a possibility that we explore in our latest work, ‘Strong interaction reconceptualised‘.
We show that it is possible to make a different case for the strong force. In this model the strong force arises from the synchronisation of discrete field elements between particules. This causes the participating particules to be interlocked: the interaction pulls or repels particules into co-location and then holds them there, hence the apparent attractive-repulsive nature of that force and its short range.
This is a different way of looking at the force. It is an efficient theory, because it does not need the force to change its nature depending on range. Also, it does not introduce any new variables or new types of force. In a later post we plan to expand on the implications of this, but for now the main point is that the strong force might be better represented as a synchronisation of frequency between two particles, rather than the exchange of particles.
Theories that challenge the orthodoxy quite rightly have a difficult reception. And while it is true that hidden-variable solutions fit in that category, there are some renowned physicists that support this type of interpretation.
One of these is Antony Valentini of the Perimeter Institute for Theoretical Physics. Valentini supports the idea of non-locality, and has worked extensively on hidden-variable solutions, particularly the de Broglie-Bohm model. He proposes that the deeper determinism is hidden from us by statistical noise.
More controversially, he proposes that quantum mechanics might apply to matter in this epoch, but not necessarily always, hence ‘“non-quantum” … matter might exist today in the form of relic particles from the early universe’.
It is an interesting idea, and may yet have exciting results. After all, the pilot-wave theory made important contributions to the formation of what we now know as quantum mechanics, before being largely abandoned (see Why does nobody like pilot-wave theory?), and it would be interesting to see it back at the front of developments.
We need fresh ideas like these. Physics suffers from a fundamental epistemic incongruence:
‘For a theory that has the world’s finest physicists baffled, quantum mechanics is fantastically successful. … But it is also strange, frustrating and incomprehensible.’ (Chown, 2002)
It is the role of thinking to identify the starting points of the possible new solution paths. Valentini is doing just this. His approach is a bold one, and we wish him well.
By comparison, the cordus conjecture also supports a non-local hidden-variable model, though not the particular theory of de Broglie-Bohm. Our model is here http://vixra.org/abs/1203.0086
Regardless of what theory might finally emerge to be correct, the search for solutions within the hidden-variable family of designs is worth contemplating. It has the potential to unlock a more complete and coherent theory of fundamental physics. Let’s face it, quantum mechanics has been going for three-quarters of a century, and while its maths works well enough, it still can’t give a coherent descriptive explanation of reality. Obviously something is going to have to change, and it remains to be seen whether QM is up to it.
Chown, M., 2002, Core reality, New Scientist
An Interview with Antony Valentini, 202, Metanexus
Antony Valentini, Wikipedia
Towler, M., 2009, Why does nobody like pilot-wave theory?
Valentini, A., Subquantum Information and Computation. Pramana Journal of Physics, 2002. 59(2): p. 269-277. DOI: 10.1007/s12043-002-0117-1. Available from: http://arxiv.org/abs/quant-ph/0203049.
They said it
‘I quickly decided that I personally didn’t like pilot-wave theory, partly because it seems to me that it throws out all the deep, amazing and experimentally verified links between modern physics and mathematics that motivate what I love about the subjects, getting nothing much in return. I don’t see a good reason to believe that research in this area is going to lead to something interesting, but those who do have every right to keep trying.’ Peter Woit, here
Bell’s warning: ‘Many [regard] investigating the roots of quantum reality as “crackpot physics”‘ Anil Ananthaswamy in New Scientist
‘Valentini will have a hard time convincing sceptics. But it could be worth it. “It would mean that physics was finally making progress on a problem on which we have been stuck for many decades,” says Smolin. “Right now we’re staring into a sort of quantum fog,” says Valentini. “If we admit that an unexplored level might lie behind it, a whole new world comes into focus.” ‘ Chown, M., 2002, Core reality, New Scientist
Just suppose the quantum world is built on more solid foundations. It could explain a lot of weird stuff, says Marcus Chown (Core reality)
Now! Wave-particle duality for cars!
More seriously, one of the all time mysteries is why quantum mechanics does not scale up to this level. QM seems to accurately describe the doings of the atomic world of particles. Yet it does not seem to scale up to macroscopic objects. For example, the biggest objects that have been shown to pass through the double-slit experiment are molecules (see below). Nothing like a car.
Quantum mechanics itself can’t explain why it doesn’t scale up to macroscopic objects. But with the cordus conjecture we think we can.
The answer, we propose, has to do with two factors. The first is the scale over which coherence can be sustained. Very roughly, coherence refers to all the particles moving as one. QM assumes that coherence applies without limit, to the large assemblies of atoms that make up macroscopic bodies like cars, cats, and the objects that we can see with our senses. Clearly that is not the case, because cars don’t go through both exits at once, nor are cats simultaneously alive and dead (Schrodinger’s cat). Cordus explains why coherence does not apply to these objects. It has to do with the inability of every atom in the object to move as one. In turn that inability arises because of the mixture of atoms/molecules, the internal flows of material (especially important for living creatures), and the warm temperature of these objects. So these objects are not in internal coherence. Therefore they are not in geometric superposition. Which means they cannot be in two places at once. Therefore the car cannot go through both exits at once.
The second factor is that even if a macroscopic object can be placed in coherence (which we predict is likely but only for small cold inanimate objects), then the required width of the slits is predicted to be as wide as the object, so the two slits will merge into a simple gap, and the resulting positional uncertainty of the object once through the gap (i.e. fringes) will be small.
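The scale mismatch can also be seen in the de Broglie wavelength λ = h/(mv), which sets the fringe spacing in any double-slit arrangement. A comparison for a car and a C60 fullerene molecule (the masses and speeds are assumed, illustrative values, the molecule speed being in the range of the actual beam experiments):

```python
# de Broglie wavelengths: car vs C60 fullerene molecule
h = 6.62607015e-34             # Planck constant, J s

m_car, v_car = 1500.0, 10.0    # assumed: 1500 kg car at 10 m/s
m_c60 = 60 * 12 * 1.66054e-27  # C60 mass (~720 atomic mass units), kg
v_c60 = 200.0                  # assumed beam speed, m/s

lam_car = h / (m_car * v_car)  # ~4e-38 m, far below any measurable scale
lam_c60 = h / (m_c60 * v_c60)  # ~3e-12 m, picometre scale

print(f"car: {lam_car:.1e} m, C60: {lam_c60:.1e} m")
```

The car’s wavelength comes out some 25 orders of magnitude smaller than an atomic nucleus, so even if perfect coherence could somehow be arranged, no slit geometry could ever reveal its wave behaviour.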
You can read more about this interesting topic below, including some recent examples of large molecules going through the double-slit device, and our own more detailed explanations of quantum mechanics’ scaling problem.
- The Double-Slit Garage Experiment http://www.improbable.com/airchives/paperair/volume7/v7i6/doubleslit.html
- Particle-wave duality demonstrated with largest molecules yet (arstechnica.com)
- Quantum mechanics on steroids: Even the largest molecules behave like waves (mnn.com)
- Why does quantum mechanics not scale up? http://vixra.org/abs/1107.0019
- Limits of Coherence: Where and Why is the Transition to Discoherence? http://vixra.org/abs/1201.0043