Abstract: The general study of the ontology of quantum field theories (QFTs) concerns whether particles or fields are more fundamental. Both views are well motivated, although each is subject to serious criticism. Given that the current versions of the particle interpretation and the field interpretation are unsatisfactory, I propose a mixed ontology of particles and fields in the framework of QFT. I argue that the ontological question should focus on how to view particles and fields consistently in QFT, given that they are the natural candidates for the ontology of QFT. In particular, on this reading, I adopt a functionalist account of ontology and defend a mixed ontology of QFT. I address a paradigmatic case of the mixed-ontology approach: a particle/field duality defined in terms of functional equivalence between particles and fields. Functionalism about ontology provides new insight into resolving the problem of unitarily inequivalent representations, one of the major interpretational issues of QFT. PubDate: 2024-07-27
Abstract: The tradition of natural kinds has shaped philosophical debates about scientific classification but has come under growing criticism. Responding to this criticism, Reydon and Ereshefsky present their grounded functionality account as a strategy for updating and defending the tradition of natural kinds. This article argues that grounded functionality does indeed provide a fruitful philosophical approach to scientific classification but does not convince as a general theory of natural kinds. Instead, the strengths and limitations of Reydon and Ereshefsky’s account illustrate why it is time to move beyond general definitions of “natural kind” and experiment with new philosophical frameworks. PubDate: 2024-07-27
Abstract: Different species of realism have been proposed in the scientific and philosophical literature. Two of these species are direct realism and causal pattern realism. Direct realism is a form of perceptual realism proposed by ecological psychologists within cognitive science. Causal pattern realism has been proposed within the philosophy of model-based science. Both species are able to accommodate some of the main tenets and motivations of instrumentalism. The main aim of this paper is to explore the conceptual moves that make both direct realism and causal pattern realism tenable realist positions able to accommodate an instrumentalist stance. These conceptual moves are (i) the rejection of veritism and (ii) the re-structuring of the phenomena of interest. We then show that these conceptual moves are instances of those characterizing a common realist genus, which we name pragmatist realism. PubDate: 2024-07-26
Abstract: This article provides an epistemological assessment of climate analogue methods, with specific reference to the use of spatial analogues in the study of the future climate of target locations. Our contention is that, due to formal and conceptual inadequacies of geometrical dissimilarity metrics and the loss of relevant information, especially when reasoning from the physical to the socio-economic level, purported inferences from climate analogues of the spatial kind we consider here prove limited in a number of ways. Indeed, we formulate five outstanding problems concerning the search for best analogues, which we call the problem of non-uniqueness of the source, the problem of non-uniqueness of the target, the problem of average, the problem of non-causal correlations, and the problem of inferred properties, respectively. In the face of such problems, we then offer two positive recommendations for a fruitful application of this methodology to impact, adaptation, and vulnerability studies of climate change, especially in the context of what we may prosaically dub “twin cities”. Arguably, such recommendations help decision-makers constrain the set of plausible climate analogues by integrating local knowledge relevant to the locations of interest. PubDate: 2024-07-22
Abstract: This paper attempts to revive the epistemological discussion of scientific articles. What are their epistemic aims, and how are they achieved? We argue that scientific experimental articles are best understood as a particular kind of narrative: i.e., modernist narratives (think: Woolf, Joyce), at least in the sense that they employ many of the same techniques, including colligation and the juxtaposition of multiple perspectives. We suggest that this way of writing is necessary given the nature of modern science, but it also has specific epistemic benefits: it provides readers with an effective way to grasp the content of scientific articles, which increases their understanding. On the other hand, modernist writing is vulnerable to certain kinds of epistemic abuses, which can be found instantiated in modern scientific writing as well. PubDate: 2024-07-17
Abstract: Among the various proposals for quantum ontology, both wavefunction realists and primitive ontologists have argued that their approach is to be preferred because it relies on intuitive notions: locality, separability, and spatiotemporality. As such, these proposals should be seen as normative frameworks asserting that one should choose the fundamental ontology which preserves these intuitions, even if they disagree about their relative importance: wavefunction realists favor preserving locality and separability, while primitive ontologists advocate for spatiotemporality. In this paper, I first clarify the main tenets of wavefunction realism and the primitive ontology approach, arguing that seeing the latter as favoring constructive explanation makes sense of its requirement of a spatiotemporal ontology. Then I show how the aforementioned intuitive notions cannot all be kept in the quantum domain. Consequently, wavefunction realists rank locality and separability higher than spatiotemporality, while primitive ontologists do the opposite. I conclude, however, that the choice of which notions to favor is not as arbitrary as it might seem. In fact, they are not independent: requiring locality and separability can soundly be justified by requiring spatiotemporality, but not the other way around. If so, the primitive ontology approach has a better justification of its intuitions than its rival wavefunction realist framework. PubDate: 2024-07-10
Abstract: Interpretation plays a central role in using scientific models to explain natural phenomena: meaning must be bestowed upon a model, in terms of what it is and what it represents, for it to be used in model explanations. However, it remains unclear how capacious and complex interpretation in models can be, particularly when conducted by the same group of scientists in the context of one explanatory project. This paper sheds light upon this question by examining modelling and explanatory practices related to the Olami-Feder-Christensen model of earthquakes. This case study shows that various interpretations are intricately connected in the overall meaning of a model used for model explanations. This leads to a manifold picture of interpretation, according to which scientific models are construed as networks of interconnected meanings. As scientists ponder and integrate these various interpretations, guided by locally attended epistemic interests, they achieve model explanations with layers of content, both in their explanantia and explananda. PubDate: 2024-07-05
Abstract: In this paper I use data from interviews conducted with coral scientists to examine the socio-ecological dimensions of science, i.e. how science shapes and is shaped by the living world around it. I use two sets of ideas in particular: niche construction and socio-ecological value frameworks. Using these, I offer socio-ecological criteria by which coral scientists evaluate the activities of coral science, more specifically which living systems are intended to benefit from coral science as an activity, and the motivations behind this. The overall picture I present is one of coral science as an activity primarily aimed at sustaining a diverse set of living systems, including humans, other organisms, species, and ecosystems, and the social practices associated with these. The value relations between scientists and aspects of these processes dictate how they respond to shifts in the socio-ecological context coral science is embedded in, explaining why the activities associated with coral science are changing as reef ecosystems are threatened. The implication is that the natural sciences more generally are entangled with a greater number of social and ecological processes than is typically considered, and that shifts in the activities undertaken by scientists may be driven by ecological as well as social and epistemic processes. PubDate: 2024-06-20
Abstract: What should we do when two conflicting ontologies are both fruitful, though their fruitfulness varies by context or location? To achieve reconciliation, it is not enough to advocate pluralism. There are many varieties of pluralism, and not all will serve equally well; some may be inconsistent, others unhelpful. This essay considers another option: local ontology. For a pair of ontologies, a local ontology consists of two claims: (1) each location enjoys a unique ontology, and (2) neither ontology is most fundamental nor most global. To argue for this view and provide an example, I develop a local ontology for two scientific ontologies: processualism and new mechanism. To further support this ontology, I argue against two varieties of pluralism: first, a pluralism based on directly unifying the assumptions of both ontologies and, second, one that allows both ontologies to coexist within a discipline. I argue that the first option is inconsistent and the second is unhelpful. I conclude that this local ontology provides us with a consistent and fruitful account that includes elements from both mechanism and processualism. PubDate: 2024-06-19 DOI: 10.1007/s13194-024-00587-4
Abstract: Structural realists claim that structure is preserved across instances of radical theory change, and that this preservation provides an argument in favor of realism about structure. In this paper, I use the shift from Newtonian gravity to Einstein’s general relativity as a case study for structural preservation, and I demonstrate that two prominent views of structural preservation fail to provide a solid basis for realism about structure. The case study demonstrates that (i) structural realists must be epistemically precise about the concrete structure that is being preserved, and (ii) they must provide a metaphysical account of how structure is preserved through re-interpretation in light of a new theory. Regarding (i), I describe a means of epistemic access to the unobservable that I call “thick detection” of structure, which isolates the structure that will be preserved. Regarding (ii), I argue that thickly detectable structure is preserved across theory change through a process of extracting the old structure from the new structure, much like what has been done with geometrized versions of Newtonian gravity. With these two responses in hand, the structural realist can adequately account for the preservation of structure and can provide a strong argument in favor of structural realism. PubDate: 2024-06-13 DOI: 10.1007/s13194-024-00588-3
Abstract: In a recent paper, Firt, Hemmo and Shenker argue that Hempel’s dilemma, typically thought to primarily undermine physicalism, is generalizable and impacts mind-body dualism and many other theories equally. I challenge this view and argue that Hempel’s dilemma admits of at least two distinct construals: a general-skeptical construal, underpinned by historically driven arguments such as the pessimistic induction, and a non-skeptical construal, driven by the specific puzzles and volatility of current physics. While the general-skeptical construal applies to all changeable deep-structure theories, the non-skeptical construal primarily targets volatile theories which harbor exclusionary ambitions. As a result, dualism largely evades both construals due to the stability of theories of the mental and their lack of exclusionary ambitions. Conversely, physicalism is uniquely susceptible to both construals due to its strong commitment to deep-structure realism, inherent exclusionary ambitions, and the volatility of certain branches of fundamental physics. The paper ultimately concludes that Hempel’s dilemma is not universally problematic, but presents a unique challenge to physicalism while being relatively congenial to dualism. PubDate: 2024-06-13 DOI: 10.1007/s13194-024-00590-9
Abstract: Artificial intelligence algorithms, fueled by continuous technological development and increased computing power, have proven effective across a variety of tasks. Concurrently, quantum computers have shown promise in solving problems beyond the reach of classical computers. These advancements have contributed to a misconception that quantum computers enable hypercomputation, sparking speculation about quantum supremacy leading to an intelligence explosion and the creation of superintelligent agents. We challenge this notion, arguing that current evidence does not support the idea that quantum technologies enable hypercomputation. Fundamental limitations on information storage within finite spaces and on the accessibility of information from quantum states constrain quantum computers from surpassing the Turing computing barrier. While quantum technologies may offer exponential speed-ups in specific computing cases, there is insufficient evidence to suggest that focusing solely on quantum-related problems will lead to technological singularity and the emergence of superintelligence. Consequently, there is no premise suggesting that general intelligence depends on quantum effects or that accelerating existing algorithms through quantum means will replicate true intelligence. We propose that if superintelligence is to be achieved, it will not be solely through quantum technologies. Instead, the attainment of superintelligence remains a conceptual challenge that humanity has yet to overcome, with quantum technologies showing no clear path toward its resolution. PubDate: 2024-06-04 DOI: 10.1007/s13194-024-00584-7
Abstract: According to C. D. Broad, it is impossible to deduce the properties of a strongly emergent whole from a complete knowledge of the properties of its constituents when those constituents are isolated from the whole or when they are constituents of other wholes. Elanor Taylor proposes the Collapse Problem: macro-level property p supposedly emerges when its micro-level components combine in relation r; however, each component has the property that it can combine with the others in r to produce p, so Broad’s nondeducibility criterion is not met. This article argues that the amount of information required for r is physically impossible. Strong Emergence does not collapse, but the Collapse Problem does. Belief in Strong Emergence is strongly warranted. Strong Emergence occurs whenever it is physically impossible to deduce how components, in a specific relation, would combine to produce a whole with p. Almost always, that is impossible. Strong Emergence is ubiquitous. PubDate: 2024-06-04 DOI: 10.1007/s13194-024-00586-5
Abstract: The optimism vs. pessimism debate about the historical sciences is often framed in terms of arguments about the relative importance of overdetermination vs. underdetermination of historical claims by available evidence. While the interplay between natural processes that create multiple traces of past events (thereby conducive to overdetermination) and processes that erase past information (whence underdetermination) cannot be ignored, I locate the root of the debate in the epistemic granularity, or intuitively the level of detail, that pervades any historical claim justification network. To reveal the role played by granularity, I elaborate a model of historical claim justification. This model maps out the different elements that enter the justification of historical claims (including actual and inferred states of affairs, and dating and information-reconstructing theories). It also incorporates the different types of processes that affect traces of past events (information-creating, -preserving, -modifying, and -destroying processes). Granularity is shown to play a pivotal role in all elements of this model, and thereby in the inferred justification of any historical claim. As a result, while upward or downward shifts in granularity may explain changes in whether claims are considered overdetermined or underdetermined, epistemic granularity constitutes an integral part of evidential reasoning in the historical sciences (and possibly elsewhere). PubDate: 2024-05-30 DOI: 10.1007/s13194-024-00583-8
Abstract: With the present paper I maintain that the group field theory (GFT) approach to quantum gravity can help us clarify and distinguish the problems of spacetime emergence from the questions about the nature of the quanta of space. I will show that the use of approximation methods can suggest a form of indifference between scales (or phases) and that such an indifference allows us to black-box questions about the ontology of the fundamental levels of the theory. PubDate: 2024-05-22 DOI: 10.1007/s13194-024-00585-6
Abstract: A shared narrative in the literature on the evolution of cooperation maintains that social learning evolves early to allow for the transmission of cumulative culture. Social norms, whilst present at the outset, only rise to prominence later on, mainly to stabilise cooperation against the threat of defection. In contrast, I argue that once we consider insights from social epistemology, an expansion of this narrative presents itself: an interesting kind of social norm — an epistemic coordination norm — was operative in early and important instances of specialised social learning. I show how there is a need for such norms in two key social learning strategies and explain how this need is constituted. In assessor-teaching (e.g. Castro et al., 2019b, 2021), epistemic coordination norms allow agents to coordinate around the content of social learning, i.e., what is to be known and how this is to be done. These norms also allow agents to coordinate around the form of cultural learning in what is sometimes called strategic social learning (Laland, 2004; Hoppitt & Laland, 2013; Heyes, 2018, Chap. 5) and elsewhere. Broadly speaking, this concerns how cultural learning is organised within the social group. The upshot is that the evolution of social learning and the evolution of social norms are intertwined in important and underappreciated ways from early on. This matters because it informs our views about the evolution of social norms more generally. Truly social norms emerged to coordinate a plurality of complex behaviours and interactions, amongst them specialised social learning. I substantiate this view by contrasting it with Jonathan Birch’s views on the evolution of norms. What results is a general but cohesive narrative on the early evolution of social norms. PubDate: 2024-05-09 DOI: 10.1007/s13194-024-00582-9
Abstract: Reconstructions of quantum theory are a novel research program in theoretical physics which aims to uncover the unique physical features of quantum theory via axiomatization. I focus on Hardy’s “Quantum Theory from Five Reasonable Axioms” (2001), arguing that reconstructions represent a modern usage of axiomatization with significant points of continuity with von Neumann’s axiomatizations in quantum mechanics. In particular, I show that Hardy and von Neumann share a similar methodological ordering, have a common operational framing, and insist on the empirical basis of axioms. In the reconstruction programme, interesting points of discontinuity with historical axiomatizations include the stipulation of a generalized space of theories represented by a framework and the stipulation of analytic machinery at two levels of generality (first by establishing a generalized mathematical framework and then by positing specific formulations of axioms). In light of the reconstruction programme, I show that we should understand axiomatization attempts as context-dependent, where the context is contingent upon the goals of inquiry and the maturity of both the mathematical formalism and the theoretical underpinnings within the area of inquiry. Drawing on Mitsch’s (2022) account of axiomatization, I conclude that reconstructions are best understood as provisional, practical representations of quantum theory that are well suited for theory development and exploration. However, I propose my context-dependent re-framing of axiomatization as a means of enriching Mitsch’s account. PubDate: 2024-04-30 DOI: 10.1007/s13194-024-00581-w
Abstract: There is a growing concern for the proper role of science within democratic societies, which has led to the development of new science policies for the implementation of social responsibility in research. Although the very expression ‘social responsibility of science’ may be interpreted in different ways, many of these emerging policy frameworks define it, at least in part, as a form of anticipative reflection about the potential impacts of research in society. What remains a rather under-discussed issue is the definition of the bearer of the social responsibility of science. In other words, it is not clear who is supposed to engage in such an anticipative reflection, whether individual researchers or research groups. In the past few years, philosophers of science have begun to use qualitative research methods to fill the gaps between normative models of the organisation of ideal scientific communities and the reality of actual scientific practices. In this article, I follow this approach to discuss the issue of the collective dimension of the social responsibility of science. I rely on a qualitative study conducted on an interdisciplinary research group and I describe how group dynamics position individuals and distribute duties and roles, including social responsibility. Qualitative descriptions of the distribution of duties within actual research groups should inform the formulation of general prescriptive theories on the collective responsibility of science. PubDate: 2024-04-19 DOI: 10.1007/s13194-024-00580-x
Abstract: Earth system science (ESS) and modelling have given rise to a new conceptual framework in recent decades, one which goes well beyond climate science. Indeed, Earth system science and modelling have the ambition “to build a unified understanding of the Earth”, involving not only the physical Earth system components (atmosphere, cryosphere, land, ocean, lithosphere) but also all the relevant human and social processes interacting with them. This unified understanding that ESS aims to achieve raises a number of epistemological issues about interdisciplinarity. We argue that the interdisciplinary relations in ESS between the natural and the social/human sciences are best characterized in terms of what is called ‘scientific imperialism’ in the literature, and we show that this imperialistic feature has some detrimental epistemic and non-epistemic effects, notably when addressing the issue of values in ESS. This paper considers in particular the core ESS concepts of the Anthropocene, planetary boundaries, and tipping points in the light of philosophy of science discussions on interdisciplinarity and values. We show that acknowledging the interconnections between interdisciplinarity and values suggests ways for ESS to move forward in addressing climate and environmental challenges. PubDate: 2024-04-06 DOI: 10.1007/s13194-024-00579-4