Please help us test our new pre-print finding feature by giving the pre-print link a rating. A 5 star rating indicates the linked pre-print has the exact same content as the published article.

Abstract: The two-state-vector formalism presents a time-symmetric approach to standard quantum mechanics, of particular importance in the description of experiments with pre- and post-selected ensembles. In this paper, using the correspondence limit of the quantum harmonic oscillator in the two-state-vector formalism, we produce harmonic oscillators that exhibit classical behavior while having complex-valued position and momentum. This allows us to discover novel effects that cannot be achieved otherwise. The proposed classical behavior does not describe classical physics in the usual sense: the complex-valued classical effects between the initial and final boundary conditions occur only once the final measurement, which acts as a boundary condition, has been performed. This classical behavior breaks down if one does not follow the chosen boundary conditions during the experiment, since one would otherwise contradict the impossibility of retrocausality. PubDate: 2022-05-20
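
In the two-state-vector formalism a system is described by a pre-selected state \(|\psi\rangle\) and a post-selected state \(\langle\phi|\), and the weak value of an observable, \(A_w = \langle\phi|A|\psi\rangle / \langle\phi|\psi\rangle\), is in general complex — the sense in which position and momentum can take complex values here. A minimal numpy sketch (the two-level states and the observable are illustrative choices, not taken from the paper):

```python
import numpy as np

# Illustrative pre- and post-selected states of a two-level system
psi = np.array([1.0, 0.0], dtype=complex)                   # pre-selection |psi>
phi = (np.array([1.0, 1.0j]) / np.sqrt(2)).astype(complex)  # post-selection |phi>

A = np.array([[0, 1], [1, 0]], dtype=complex)               # observable (Pauli-X)

# Weak value A_w = <phi|A|psi> / <phi|psi> -- generally complex-valued
A_w = (phi.conj() @ A @ psi) / (phi.conj() @ psi)
print(A_w)
```

For this choice of states the weak value comes out purely imaginary, illustrating how a pre- and post-selected ensemble can assign a complex "value" to a Hermitian observable.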

Abstract: The well-known Bell experiment with two actors, Alice and Bob, is considered. First, the simple deduction leading to the CHSH inequality under local realism is reviewed, and some arguments from the literature are recapitulated. Then I take up certain background themes before entering a discussion of Alice’s analysis of the situation. An important point is that her mind is limited by the fact that her Hilbert space in this context is two-dimensional. General statements about a mind’s limitation during a decision process are derived from recent results on the reconstruction of quantum theory from conceptual variables. These results apply to any decision situation. Let all the data from the Bell experiment be handed over to a new actor, Charlie, who performs a data analysis. But his mind is also limited: he has a four-dimensional Hilbert space in the context determined by the experiment. I show that this implies that neither Alice nor Charlie can have the argument leading to the CHSH inequality as a background for making decisions related to the experiment. Charlie may be any data analyst, and he may communicate with any person. It is argued that no rational person can be convinced by the CHSH argument when making empirical decisions on the Bell situation. PubDate: 2022-05-14
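
For reference, the CHSH argument compares \(S = E(a,b) - E(a,b') + E(a',b) + E(a',b')\), bounded by 2 under local realism, with the quantum singlet-state prediction \(E(a,b) = -\cos(a-b)\), which reaches \(2\sqrt{2}\). A short sketch of the standard computation (the measurement angles are the usual textbook choices):

```python
import numpy as np

def E(a, b):
    """Singlet-state correlation for spin measurements along angles a and b."""
    return -np.cos(a - b)

# Standard CHSH measurement angles for Alice (a, a2) and Bob (b, b2)
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # |S| = 2*sqrt(2) ~ 2.83, above the local-realist bound of 2
```

The quantum value \(2\sqrt{2}\) (the Tsirelson bound) is what the paper's argument about Alice's and Charlie's decision contexts is measured against.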

Abstract: In recent years, a non-relativistic quantum dynamics was derived from three assumptions: (i) probability current conservation, (ii) average energy conservation, and (iii) an epistemic momentum uncertainty (Budiyono and Rohrlich in Nat Commun 8:1306, 2017). Here we show that these assumptions can themselves be derived from a natural extension of classical statistical mechanics. PubDate: 2022-05-12

Abstract: We give a conceptually simple proof of nonlocality using only the perfect correlations between results of measurements on distant systems discussed by Einstein, Podolsky and Rosen—correlations that EPR thought proved the incompleteness of quantum mechanics. Our argument relies on an extension of EPR by Schrödinger. We also briefly discuss nonlocality and “hidden variables” within Bohmian mechanics. PubDate: 2022-05-07

Abstract: The vision of quantum physics developed by David Bohm, and especially the idea of the implicate order, can be considered the true epistemological foundation of quantum field theory and of the idea of a quantum vacuum underlying the observable forms of matter, energy and space-time. Taking non-locality as the crucial visiting card of quantum processes, it is thus possible to arrive directly at the transactional interpretation and at the idea of a non-local quantum vacuum in which the behaviour of a subatomic particle constitutes the manifestation of more elementary creation/annihilation processes of quanta. PubDate: 2022-05-03

Abstract: We present a mechanical model of a quasi-elastic body (aether) which reproduces Maxwell’s equations with charges and currents. A major criticism (in: Sommerfeld, Mechanics of deformable bodies, lectures on theoretical physics, Academic Press, Inc., London, 1964) of mechanical models of electrodynamics is that any presence of charges in the known models appears to violate the continuity equation of the aether, and it remains a mystery where the aether goes and whence it comes. We propose a solution to the mystery: in the present model the aether is always conserved. Interestingly, it turns out that the charge velocity coincides with the aether velocity; in other words, the charges appear to be part of the aether itself. We interpret the electric field as the flux of the aether and the magnetic field as the torque per unit volume. In addition, we show that the model is consistent with the theory of relativity, provided that we use the Lorentz–Poincaré interpretation (LPI) of relativity theory. We give a statistical-mechanical interpretation of the Lorentz transformations. It turns out that the length of a body is contracted by the electromagnetic field which the molecules of this same body produce. This self-interaction also delays all processes, and clock dilation results. We prove this by investigating the probability distribution for a gas of self-interacting particles. We can easily extend this analysis even to elementary particles. PubDate: 2022-05-03

Abstract: The central limit theorem has been found to apply to random vectors in complex Hilbert space. This is sufficient reason to study the complex-valued Gaussian, looking for relevance to quantum mechanics. Here we show that the Gaussian, with all terms fully complex, acting as a propagator, leads to Schrödinger’s non-relativistic equation including scalar and vector potentials, assuming only that the norm is conserved. No physical laws need to be postulated a priori. The equation thereby presents as a process of irregular motion analogous to the real random walk but executed under the rules of the complex number system. There is a standard view that Schrödinger’s equation is deterministic, whereas wavefunction “collapse” is probabilistic (by Born’s rule); we now have a demonstrated linkage to the central limit theorem, indicating a stochastic picture at the foundation of Schrödinger’s equation itself. It may be an example of Wheeler’s “It from bit” with “No underlying law”. Reasons for the primary role of \(\mathbf {C}\) are open to discussion. The present derivation is compared with recent reconstructions of the quantum formalism, which aim to rationalize its obscurities. PubDate: 2022-04-23
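
The stochastic picture invoked here can be illustrated numerically: sums of i.i.d. complex-valued random steps converge to a complex Gaussian, in line with the central limit theorem for complex random vectors. A minimal sketch (the step distribution and sample sizes are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Many independent walks, each a sum of i.i.d. complex steps whose
# real and imaginary parts are uniform on (-1, 1), with variance 1/3 each.
n_walks, n_steps = 5000, 500
steps = (rng.uniform(-1, 1, (n_walks, n_steps))
         + 1j * rng.uniform(-1, 1, (n_walks, n_steps)))

# CLT-normalized endpoints: real and imaginary parts approach N(0, 1/3)
Z = steps.sum(axis=1) / np.sqrt(n_steps)
print(Z.real.var(), Z.imag.var())  # both ~ 1/3
```

The empirical variances of the normalized endpoints match the Gaussian limit, the "irregular motion under the rules of the complex number system" that the abstract describes.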

Abstract: The aim of this paper is to invalidate the hypothesis that human consciousness is necessary in the quantum measurement process. To achieve this target, I propose a considerable modification of the Schrödinger’s cat and Dead-Alive Physicist thought experiments, called “PIAR”, short for “Physicist Inside the Ambiguous Room”. A specific strategy has enabled me to plan the experiment in such a way as to logically justify the inconsistency of the above hypothesis and to oblige its supporters to rely on an alternative interpretation of quantum mechanics in which a real world of phenomena exists independently of our conscious mind and where observers play no special role. Moreover, the description of the measurement apparatus will be complete, in the sense that the experiment, given that it also includes the experimenter, will begin and end exclusively within a sealed room. Hence, my analysis will provide a logical explanation of the relationship between the observer and the objects of her/his experimental observation; this and a few other implications will be discussed in the fourth section and in the conclusions. PubDate: 2022-04-20

Abstract: Process ontology is making deep inroads into the hard sciences, for it offers a workable understanding of dynamic phenomena which sits well with inquiries that problematize the traditional conception of self-standing, definite, independent objects as the basic stuff of the universe. Process-based approaches are claimed by their advocates to yield better ontological descriptions of various domains of physical reality in which dynamical, indefinite activities are prior to definite “things” or “states of things”. However, when applied to physics a main problem comes up: the very notion of process appears to pivot on a conception of evolution through time that is at variance with relativistic physics. Against this worry, this article advances a conception of process that can be reconciled with general relativity. It claims that, within timeless physical frameworks, a process should not be conceived as an activity evolving through time. Rather, processes concern the identity that entities obtain within the broader sets of relations in which they stand. To make this case, the article homes in on one of the physical approaches that most resolutely removes time from the basic features of reality, namely canonical quantum gravity. As a case in point, it addresses Carlo Rovelli’s Evolving Constant approach as a physical paradigm that resolutely rejects time as an absolute parameter and recasts processualism as an inquiry into how physical systems affect one another. PubDate: 2022-04-18

Abstract: It is often claimed that one cannot locate a notion of causation in fundamental physical theories. The reason most commonly given is that the dynamics of those theories do not support any distinction between the past and the future, and this vitiates any attempt to locate a notion of causal asymmetry—and thus of causation—in fundamental physical theories. I argue that this is incorrect: the ubiquitous generation of entanglement between quantum systems grounds a relevant asymmetry in the dynamical evolution of quantum systems. I show that by exploiting a connection between the amount of entanglement in a quantum state and the algorithmic complexity of that state, one can use recently developed tools for causal inference to identify a causal asymmetry—and a notion of causation—in the dynamical evolution of quantum systems. PubDate: 2022-04-13
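
The asymmetry claimed here rests on the fact that generic interactions increase entanglement over time. A minimal two-qubit numpy sketch of that growth (the coupling Hamiltonian is an illustrative choice, not the paper's model):

```python
import numpy as np

def entanglement_entropy(psi):
    """Von Neumann entropy of the reduced state of the first qubit."""
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    rho_A = np.trace(rho, axis1=1, axis2=3)   # partial trace over qubit B
    p = np.linalg.eigvalsh(rho_A)
    p = p[p > 1e-12]
    return float(-(p * np.log(p)).sum())

plus = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)
zero, one = np.array([1, 0]), np.array([0, 1])

# Evolution under the illustrative coupling H = sigma_z (x) sigma_x gives
# |psi(t)> = cos(t)|+>|0> - i sin(t)|->|1>, which entangles the two qubits.
for t in [0.0, np.pi / 8, np.pi / 4]:
    psi = np.cos(t) * np.kron(plus, zero) - 1j * np.sin(t) * np.kron(minus, one)
    print(t, entanglement_entropy(psi))
```

The entropy grows monotonically from 0 (product state) to ln 2 (maximal entanglement at t = π/4), the kind of time-directed signal the paper's causal-inference argument exploits.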

Abstract: Recent advances in differential topology single out four dimensions as special, allowing for vast varieties of exotic smoothness (differential) structures, distinguished by their handlebody decompositions, even as the coarser algebraic topology is fixed. Should the spacetime we reside in take up one of the more exotic choices, and there is no obvious reason why it should not, apparent pathologies would inevitably plague calculus-based physical theories that assume the standard vanilla structure, owing to the non-existence of a diffeomorphism and the consequent lack of a suitable portal through which to transfer the complete information regarding the exotic physical dynamics into the vanilla theories. A plausible consequence of this deficiency would be the uncertainty permeating our attempted description of the microscopic world. We tentatively argue here that a re-inspection of the key ingredients of the phenomenological particle models, from the perspective of exotica, could yield interesting insights. Our short and rudimentary discussion is qualitative and speculative, because the necessary mathematical tools have only just begun to be developed. PubDate: 2022-04-09

Abstract: Beginning with the Everett–DeWitt many-worlds interpretation of quantum mechanics, there have been a series of proposals for how the state vector of a quantum system might split at any instant into orthogonal branches, each of which exhibits approximately classical behavior. Here we propose a decomposition of a state vector into branches by finding the minimum of a measure of the mean squared quantum complexity of the branches in the branch decomposition. In a non-relativistic formulation of this proposal, branching occurs repeatedly over time, with each branch splitting successively into further sub-branches among which the branch followed by the real world is chosen randomly according to the Born rule. In a Lorentz covariant version, the real world is a single random draw from the set of branches at asymptotically late time, restored to finite time by sequentially retracing the set of branching events implied by the late time choice. The complexity measure depends on a parameter b with units of volume which sets the boundary between quantum and classical behavior. The value of b is, in principle, accessible to experiment. PubDate: 2022-04-05

Abstract: The quantum transition probability assignment is an equiareal transformation from the annulus of symplectic spinorial amplitudes to the disk of complex state vectors, which makes it equivalent to the equiareal projection of Archimedes. The latter corresponds to a symplectic synchronization method, which applies to the quantum phase space in view of Weyl’s quantization approach involving an Abelian group of unitary ray rotations. We show that Archimedes’ method of synchronization, in terms of a measure-preserving transformation to an equiareal disk, imposes the integrality of the quantum of action and requires the extension of the classical moment map from the real line to the circle. Additionally, the same synchronization method is encoded in the structure of the Heisenberg group, viewed as a principal bundle with a connection, whose curvature and anholonomy are expressed in terms of area-bounding loops in relation to the underlying Abelian shadow on a symplectic plane. In this manner, we show that the geometric phase pertains to the minimal synchronized area \(\pi \hbar \) of the 2-d symplectic Abelian shadow of the symplectic ball, modulo \({\mathbb {Z}}\). The integrality condition naturally leads to the consideration of modular commutative observables pertaining to the role of the discrete Heisenberg group. We prove that the structural transition from non-commutativity to modular commutativity in accordance with Weyl’s group-theoretic commutation relations takes place via universal factorization through the discrete Heisenberg group. In this way, we derive a homology-theoretic formulation of the synchronization method in terms of the area-bounding cells of the modular lattice \(\frac{{\mathbb {R}}^2}{{\mathbb {Z}}^2}\) in relation to any Abelian symplectic shadow. Thus, we finally obtain the physical interpretation of the analytic representation of quantum states as theta functions corresponding to the sections of a complex line bundle with an integral symplectic structure. PubDate: 2022-04-05

Abstract: The quantum transition probability bears the symmetry of a process “evolving” around a symplectic area-bounding loop in the projective Hilbert space that essentially underlies the notion of a global geometric phase. The basic idea is that this symmetry can be associated with a synchronization procedure in the quantum phase space, which remains only implicit owing to the insistence on interpreting the temporal variable as a classical one, overlooking the subtle interrelation of the complex with the symplectic structure. This leads to the qualification of quantum amplitudes in terms of symplectic spinorial objects which doubly cover the corresponding complex vectors obtained through squaring. Their interrelation can be simply formulated in terms of null vectors of a 3-d Minkowski space, which proves instrumental for the Hermitian and unitary representations of the symplectic group. In this light, we show that the quantum transition probability assignment constitutes an equiareal transformation from the annulus of spinors to the disk of complex vectors, which makes it equivalent to the equiareal measure-preserving transformation of Archimedes. This realization ignited a process of re-evaluating the original works of Archimedes in light of their conceptual significance. It turns out that the symplectic method of equiareal projection onto a disk pertains to a precise synchronization procedure that can be applied to the quantum phase space. From this viewpoint, we examine the pertinent notions of time, entangled symplectic area and objective indistinguishability, and physically qualify the non-squeezing theorem of symplectic geometry. PubDate: 2022-04-05

Abstract: The Mössbauer rotor effect has recently gained renewed interest due to the discovery and explanation of an additional effect of clock synchronization which had been missed for about 50 years, from a famous book of Pauli up to some recent experimental analyses. The theoretical explanation of this additional effect is due to some recent papers in both the general relativistic and the special relativistic frameworks. In the first case (the general relativistic framework) the key point of the approach is Einstein’s equivalence principle (EEP), which, in Einstein’s own words, enables “the point of view to interpret the rotating system K’ as at rest, and the centrifugal field as a gravitational field”. In this paper, we analyse both the history of the Mössbauer rotor effect and its interpretation from the point of view of Einstein’s general theory of relativity (GTR), adding some new insight. In particular, it will be shown that, if on one hand the “traditional” effect of redshift has a strong analogy with the gravitational redshift, on the other hand the additional effect of clock synchronization has an intriguing analogy with the cosmological redshift. Finally, we show that a recent claim in the literature that the second effect of clock synchronization does not exist is not correct. PubDate: 2022-03-29

Abstract: We report to what extent elementary particles and the nucleons can be described by an Ansatz that is an alternative to the established standard model and can still yield predictions that reproduce the observed results, without using the formalism of quantum mechanics. The different Ansatz is motivated by the attempt to explain known properties of elementary particles as consequences of an inner structure, in contrast to the approach of the standard model, where the properties are ascribed to point-like particles. Based on the assumption of the existence of photons, the possibility of the creation of a fermion and an anti-fermion in an interaction of two photons of equal energy is shown. The properties of these created elementary material particles are found to agree with the observed ones. The possibility of the creation of a neutron by the interaction of two photons of equal energy is also shown. In this case, the newly formed neutron rests in the center of the collision system as a combined system of the two localized photons. The created neutron is shown to have the known properties of the neutron and, in addition, a definite shape of definite size. The proton is described as the particle formed by the decay of the neutron, likewise possessing the observed properties and, in addition, a definite shape. For all particles described by the Ansatz, their properties are consequences of an inner structure. The merits of the alternative description, as compared to the standard model and the application of quantum mechanics, are discussed. PubDate: 2022-03-18
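
One fixed point any such pair-creation account must respect is the relativistic energy threshold: in a head-on collision of two photons of equal energy, each photon must carry at least the rest energy of the created fermion. A quick numeric check for the electron–positron case (CODATA constants; the check itself is generic, not a calculation from the paper):

```python
# Threshold for gamma + gamma -> e+ + e- with two photons of equal energy
# colliding head-on: each photon needs at least the electron rest energy.
m_e = 9.1093837015e-31   # kg, electron mass
c = 299_792_458.0        # m/s, speed of light
MeV = 1.602176634e-13    # J per MeV

E_threshold = m_e * c**2
print(E_threshold / MeV)  # ~ 0.511 MeV per photon
```
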

Abstract: To answer foundational questions in physics, physicists turn more and more to abstract advanced mathematics, even though its physical significance may not be immediately clear. What if we started to borrow ideas and approaches, with appropriate modifications, from the foundations of mathematics? In this paper we explore this route. In reverse mathematics one starts from theorems and finds the minimum set of axioms required for their derivation. In reverse physics we want to start from laws or more specific results, and find the physical concepts and starting points that recover them. We want to understand which physical results are implied by which physical assumptions. As an example of the technique, we present six different characterizations of classical mechanics, show that the uncertainty principle depends only on the entropy bound on pure states, and recast the third law of thermodynamics in terms of the entropy of an empty system. We believe the approach can provide greater insight into both current and new physical theories, put the physical concepts at the forefront of the discussion, and provide a more unified view of physics by highlighting common patterns and ideas across different physical theories. PubDate: 2022-03-16

Abstract: The logical structure of quantum gravity (QG) is addressed in the framework of the so-called manifestly covariant approach. This makes it possible to display its close analogy with the logic of quantum mechanics (QM). More precisely, in QG the conventional 2-way principle of non-contradiction (2-way PNC) holding in Classical Mechanics is shown to be replaced by a 3-way principle (3-way PNC). The third state of logical truth corresponds to quantum indeterminacy/undecidability, i.e., the occurrence of quantum observables with infinite standard deviation. The same principle coincides, incidentally, with the one earlier shown to hold, in analogous circumstances, for QM in Part I. However, this conclusion is found to apply only provided a well-defined manifestly-covariant theory of the gravitational field is adopted both at the classical and quantum levels. Such a choice is crucial. In fact, it makes possible the canonical quantization of the underlying unconstrained Hamiltonian structure of general relativity, according to an approach recently developed by Cremaschini and Tessarotto (2015–2021). Remarkably, in the semiclassical limit of the theory, Classical Logic is proved to be correctly restored, together with the validity of the conventional 2-way principle. PubDate: 2022-03-14

Abstract: Distinguishability plays a major role in quantum and statistical physics. When particles are identical, their wave function must be either symmetric or antisymmetric under permutations, and the number of microscopic states, which determines entropy, is counted up to permutations. When the particles are distinguishable, wavefunctions have no symmetry and each permutation is a different microstate. This binary and discontinuous classification raises a few questions: one may wonder what happens if particles are almost identical, or when the property that distinguishes between them is irrelevant to the physical interactions in a given system. Here I sketch a general answer to these questions. For any pair of non-identical particles there is a timescale, \(\tau_d\), required for a measurement to resolve the differences between them. Below \(\tau_d\) particles seem identical, above it different, and the uncertainty principle provides a lower bound for \(\tau_d\). Thermal systems admit a conjugate temperature scale, \(T_d\). Above this temperature the system appears to equilibrate before it resolves the differences between particles; below it, the system identifies these differences before equilibrating. As the physical differences between particles decline towards zero, \(\tau_d \rightarrow \infty\) and \(T_d \rightarrow 0\). PubDate: 2022-03-10 DOI: 10.1007/s10701-022-00557-x
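
The two scales can be made concrete with an order-of-magnitude estimate, assuming the time-energy uncertainty form \(\tau_d \sim \hbar/\Delta E\) for the resolution timescale and the conjugate scale \(T_d \sim \Delta E/k_B\) (these explicit forms are our illustrative reading of the abstract, not formulas quoted from the paper):

```python
# Illustrative order-of-magnitude estimate. Assumed forms:
# tau_d ~ hbar / dE (time-energy uncertainty), T_d ~ dE / k_B.
hbar = 1.054571817e-34   # J*s
k_B = 1.380649e-23       # J/K

dE = 1e-25               # J, assumed energy splitting between the two particles

tau_d = hbar / dE        # timescale to resolve the difference (~1e-9 s here)
T_d = dE / k_B           # conjugate temperature scale (~7e-3 K here)
print(tau_d, T_d)
```

Shrinking the assumed splitting dE sends tau_d to infinity and T_d to zero, reproducing the limiting behaviour stated at the end of the abstract.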

Abstract: Since the emergence of computing as a mode of investigation in the sciences, computational approaches have revolutionised many fields of inquiry. Recently in philosophy, the question has begun to be rendered, bit by bit: could computation be considered a deeper fundamental building block of all of reality? This paper proposes a continuum computing construct, predicated on a set of core computational principles: computability, discretisation, stability and optimisation. The construct is applied to the set of most fundamental physical laws, in the form of the non-relativistic conservation equations which underpin our governing physics. The discretisation approach divides all of space into a mesh of discrete computational cells, and the evolution of data in those cells must obey pre-defined stability conditions from established continuum computing theory, namely the Courant–Friedrichs–Lewy condition. By evolving cell-state data in a manner which logically optimises computational efficiency, combined with the defined stability condition, the construct derives a central coupling of space, time, and the fastest speed of information propagation. This coupling, formed at the lowest level by the computing construct, naturally and inherently produces aspects of special relativity and general relativity at the macroscale. This paper therefore proposes a new explanation of why the nature of space and time may be fused, and explores simple emergent congruities with relativity. The theory invites us to reverse the philosophical premise of the original Simulation Argument. That argument currently leads us to consider: how might all of the observed physics of our reality be reproduced computationally in a simulation of our existence? This work asks instead: could foundations of the physics of our universe, namely spacetime coupling, emerge inherently, and necessarily, from fundamental principles of computation? PubDate: 2022-03-08 DOI: 10.1007/s10701-022-00549-x
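
The stability condition named above has a simple closed form: for an explicit scheme on cells of size dx with fastest signal speed v_max, the Courant–Friedrichs–Lewy condition bounds the time step by dt ≤ C·dx/v_max with Courant number C ≤ 1, which is exactly the space-time-speed coupling the construct builds on. A minimal sketch with c as the top speed (the 1 m cell size is an arbitrary illustrative value):

```python
def cfl_timestep(dx, v_max, courant=1.0):
    """Largest stable explicit time step for cell size dx and top speed v_max,
    per the Courant-Friedrichs-Lewy condition dt <= courant * dx / v_max."""
    return courant * dx / v_max

c = 299_792_458.0        # m/s, fastest speed of information propagation
dx = 1.0                 # m, assumed computational cell size
dt = cfl_timestep(dx, c)
print(dt)  # ~ 3.34e-9 s: dt is fixed jointly by dx and c
```

Once dx and the top speed are fixed, the admissible dt is fixed too; this is the sense in which the discretised construct inherently fuses space, time, and the maximum propagation speed.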