Abstract: Inspired by Fisher's geometric approach to the study of beneficial mutations, we analyse probabilities of beneficial mutation and crossover recombination of strings in a general Hamming space with an arbitrary finite alphabet. Mutations and recombinations that reduce the distance to an optimum are considered beneficial. Geometric and combinatorial analysis is used to derive closed-form expressions for transition probabilities between spheres around an optimum, giving a complete description of the Markov evolution of distances from an optimum over multiple generations. This paves the way for optimization of the parameters of mutation and recombination operators. Here we derive optimality conditions for mutation and recombination radii maximizing the probabilities of mutation and crossover into the optimum. The analysis highlights important differences between these evolutionary operators. While mutation can potentially reach any part of the search space, the probability of beneficial mutation decreases with distance to an optimum, and the optimal mutation radius or rate should also decrease, resulting in a slow-down of evolution near the optimum. Crossover recombination, on the other hand, acts in a subspace of the search space defined by the current population of strings. However, the probabilities of beneficial and deleterious crossover are balanced, and their characteristics, such as variance, are translation invariant in a Hamming space, suggesting that recombination may complement mutation and boost the rate of evolution near the optimum. PubDate: 2025-06-23
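A minimal Monte Carlo sketch of the notion of beneficial mutation used here, for binary strings in a Hamming space: a mutation of radius r changes r positions, and it is beneficial when it reduces the Hamming distance to the optimum. All names, the alphabet, and the distances are illustrative, not taken from the paper.

```python
import random

def hamming(x, y):
    """Hamming distance between two equal-length strings."""
    return sum(a != b for a, b in zip(x, y))

def mutate(x, radius, alphabet):
    """Change `radius` randomly chosen positions to random different symbols."""
    x = list(x)
    for i in random.sample(range(len(x)), radius):
        x[i] = random.choice([c for c in alphabet if c != x[i]])
    return "".join(x)

def p_beneficial(x, opt, radius, alphabet, trials=20_000):
    """Monte Carlo estimate of P(a mutation of given radius moves x closer to opt)."""
    d0 = hamming(x, opt)
    hits = sum(hamming(mutate(x, radius, alphabet), opt) < d0 for _ in range(trials))
    return hits / trials

alphabet = "01"
opt = "0" * 20
x = "0" * 12 + "1" * 8          # current string, distance 8 from the optimum
for r in range(1, 6):
    print(r, p_beneficial(x, opt, r, alphabet))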
Abstract: Detecting and exploiting similarities between seemingly distant objects is without doubt an important human ability. This paper develops from the ground up an abstract algebraic and qualitative notion of similarity based on the observation that sets of generalizations encode important properties of elements. We show that similarity defined in this way has appealing mathematical properties. As we construct our notion of similarity from first principles using only elementary concepts of universal algebra, to convince the reader of its plausibility, we show that it can model fundamental relations occurring in mathematics and can be naturally embedded into first-order logic via model-theoretic types. Finally, we sketch some potential applications to theoretical computer science and artificial intelligence. PubDate: 2025-06-20
Abstract: We study the set of optimal solutions of the dual linear programming formulation of the linear assignment problem (LAP) to propose a method for computing a solution from the relative interior of this set. Assuming that an arbitrary dual-optimal solution and an optimal assignment are available (for which many efficient algorithms already exist), our method computes a relative-interior solution in linear time. Since the LAP occurs as a subproblem in the linear programming (LP) relaxation of the quadratic assignment problem (QAP), we employ our method as a new component in the family of dual-ascent algorithms that provide bounds on the optimal value of the QAP. To make our results applicable to the incomplete QAP, which is of interest in practical use-cases, we also provide a linear-time reduction from the incomplete LAP to the complete LAP along with a mapping that preserves optimality and membership in the relative interior. Our experiments on publicly available benchmarks indicate that our approach with relative-interior solution can frequently provide bounds near the optimum of the LP relaxation and its runtime is much lower when compared to a commercial LP solver. PubDate: 2025-05-21
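For context, a small SciPy sketch of the primal/dual pair behind the LAP discussed above; the paper's relative-interior construction itself is not reproduced here, and the matrix size and costs are arbitrary illustrations.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment, linprog

rng = np.random.default_rng(0)
n = 5
C = rng.integers(0, 10, size=(n, n))

# Primal: an optimal assignment via the Hungarian/Jonker-Volgenant method.
rows, cols = linear_sum_assignment(C)
primal = C[rows, cols].sum()

# Dual LP: max sum(u) + sum(v)  s.t.  u_i + v_j <= C_ij, u and v free.
# linprog minimizes, so negate the objective; A_ub @ [u; v] <= C.flatten().
A = np.zeros((n * n, 2 * n))
for i in range(n):
    for j in range(n):
        A[i * n + j, i] = 1.0
        A[i * n + j, n + j] = 1.0
res = linprog(c=-np.ones(2 * n), A_ub=A, b_ub=C.flatten().astype(float),
              bounds=[(None, None)] * (2 * n), method="highs")
print(primal, -res.fun)   # strong duality: the two values coincide
```

The dual-optimal solution returned by the solver is generally a vertex of the optimal face; the paper's contribution is a linear-time step from such a solution (plus an optimal assignment) to a point in the relative interior of that face.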
Abstract: We elaborate, as a benchmark, a list of geometry theorems that are regularly used in Hungarian high schools, in specialized tracks for mathematics. These statements are then implemented as GeoGebra files and algebraically proven (with certification) in GeoGebra Discovery. Next, these files are imported into Java Geometry Expert (JGEX) and geometrically proven with the associated Geometry Deductive Database method. In both programs, through quite different approaches in each case, we define and implement an algorithmic measure of the difficulty of a geometric theorem. We then compute the corresponding measures on the selected list of statements and analyze the obtained collection of pairs of difficulty grades, finding a moderate positive correlation (0.51) between GeoGebra Discovery's algebraic and JGEX's deductive difficulty formulations. This correlation seems to support the adequacy of the chosen approximation in our ongoing work to algorithmically establish some ad-hoc measure of the difficulty of geometric statements that could reasonably coincide with the appreciation of human expert users (e.g. teachers). PubDate: 2025-05-19
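The reported figure is an ordinary Pearson correlation between the two difficulty gradings; a toy computation with made-up grades (the actual benchmark data is not reproduced here):

```python
import numpy as np

# Hypothetical difficulty grades from the two systems for a few theorems.
algebraic = np.array([3, 5, 2, 8, 6, 4, 7])   # GeoGebra Discovery (illustrative)
deductive = np.array([2, 6, 3, 7, 4, 5, 9])   # JGEX (illustrative)
r = np.corrcoef(algebraic, deductive)[0, 1]   # Pearson correlation coefficient
print(round(r, 2))
```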
Abstract: In this study, we consider a computer network with the shape of a king's graph. The king's graph G is defined by its edge set $$E=\{((i,j),(p,q)) \mid i,p \in [0,M],\ j,q \in [0,N],\ M,N \in \textbf{Z},\ (i = p \text{ and } j = q\pm 1) \text{ or } (i = p\pm 1 \text{ and } j = q) \text{ or } (i = p\pm 1 \text{ and } j = q\pm 1)\}$$. We also set a delivery rule, in which the shortest paths in the graph are used for message deliveries, to restrict resource consumption. The paths are then encoded with a binary-array scheme that we develop based on other well-known encoding methods. We prove that the presented path-coding method prevents false-positive errors in the graph. Data transfer issues from computer science served as the motivation for this study. PubDate: 2025-05-15
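A direct transcription of the edge set above into Python (grid bounds are illustrative): two vertices are adjacent exactly when they differ by at most one in each coordinate and are not identical, i.e. a king's move on a chessboard.

```python
from itertools import product

def king_graph_edges(M, N):
    """Edges of the king's graph on the (M+1) x (N+1) grid:
    (i, j) ~ (p, q) iff the cells differ by at most 1 in each
    coordinate and are distinct."""
    vertices = list(product(range(M + 1), range(N + 1)))
    edges = set()
    for (i, j) in vertices:
        for di, dj in product((-1, 0, 1), repeat=2):
            p, q = i + di, j + dj
            if (di, dj) != (0, 0) and 0 <= p <= M and 0 <= q <= N:
                edges.add(frozenset(((i, j), (p, q))))
    return edges

print(len(king_graph_edges(2, 2)))  # 3x3 grid: 20 king-move edges
```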
Abstract: System W is an inference method for conditional belief bases with some notable properties, like capturing system Z while, in contrast to system Z, avoiding the drowning problem. This paper further investigates the properties of system W. We show how system W behaves with respect to postulates put forward for inference relations. We develop postulates ensuring compliance with syntax splitting for inference operators based on a full strict partial order on worlds. By observing that system W satisfies these axioms, it is proven that system W satisfies syntax splitting. We also explore how syntax splitting affects the strict partial order underlying system W and exploit this for answering certain types of queries without having to determine this strict partial order completely. However, the original definition of system W and the results above only consider inference from belief bases satisfying a strong notion of consistency. In the second part of this paper, we lift this limitation and extend system W to also cover inference from belief bases that only satisfy a weaker notion of consistency. We investigate the properties of this extended system W. In particular, it is shown that extended system W complies with syntax splitting and retains the desirable properties of system W. Furthermore, we give an overview of the relations of extended system W to other inductive inference operators. PubDate: 2025-05-15
Abstract: An agent-based modelling approach is a powerful means of understanding social phenomena by modelling individual behaviours and interactions. However, advancements in modelling pose challenges for the model analysis process of understanding the complex effects of input factors, especially when it comes to offering concrete policies for improving system outcomes. In this work, we propose a revised micro-dynamic analysis method that adopts pattern mining and causal discovery methods to enhance model interpretation and to facilitate group-specific policy-making. It strengthens the explanatory power of conventional micro-dynamic analysis by eliminating ambiguity in the interpretation of results and enabling a causal interpretation of a target phenomenon across subgroups. We applied our method to understand an agent-based model that evaluates the effects of a long-term care scheme on access to care. Our findings showed that the method can suggest policies for improving the equity of access more efficiently than conventional scenario analysis. PubDate: 2025-05-08
Abstract: In this paper, we present an Isabelle/HOL formalization of noncommutative and nonassociative algebraic structures known as gyrogroups and gyrovector spaces. These concepts were introduced by Abraham A. Ungar and have deep connections to hyperbolic geometry and special relativity. Gyrovector spaces can be used to define models of hyperbolic geometry. Unlike other models, gyrovector spaces offer the advantage that all definitions exhibit remarkable syntactical similarities to standard Euclidean and Cartesian geometry (e.g., points on the line between a and b satisfy the parametric equation $$ a \oplus t\otimes (\ominus a \oplus b)$$, for $$t \in \mathbb {R}$$, while the hyperbolic Pythagorean theorem is expressed as $$a^2\oplus b^2 = c^2$$, where $$\otimes $$, $$\oplus $$, and $$\ominus $$ represent gyro operations). We begin by formally defining gyrogroups and gyrovector spaces and proving their numerous properties. Next, we formalize Möbius and Einstein models of these abstract structures (formulated in the two-dimensional, complex plane), and then demonstrate that these are equivalent to the Poincaré and Klein-Beltrami models, satisfying Tarski’s geometry axioms for hyperbolic geometry. PubDate: 2025-04-26
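The Möbius operations mentioned in the abstract have well-known closed forms on the complex unit disk (Ungar's formulas); the sketch below evaluates the gyroline expression $$a \oplus t\otimes (\ominus a \oplus b)$$ numerically. It is only an illustration of the operations, not of the Isabelle/HOL formalization itself.

```python
import math

def m_add(a, b):
    """Moebius addition on the complex unit disk: a (+) b = (a + b) / (1 + conj(a) b)."""
    return (a + b) / (1 + a.conjugate() * b)

def m_smul(t, a):
    """Moebius scalar multiplication: t (x) a = tanh(t * artanh(|a|)) * a / |a|."""
    r = abs(a)
    if r == 0:
        return 0j
    return math.tanh(t * math.atanh(r)) * (a / r)

def gyroline(a, b, t):
    """Point a (+) t (x) ((-)a (+) b) on the gyroline through a and b."""
    return m_add(a, m_smul(t, m_add(-a, b)))

a, b = 0.3 + 0.1j, -0.2 + 0.4j
print(gyroline(a, b, 0.0))  # == a
print(gyroline(a, b, 1.0))  # == b, by the gyrogroup left cancellation law
```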
Abstract: Logical Analysis of Data (LAD) is a powerful technique for data classification based on partially defined Boolean functions. The decision rules for class prediction in LAD are formed out of patterns. According to different preferences in the classification problem, various pattern types have been defined. The generation of these patterns plays a key role in the LAD methodology and represents a computationally hard problem. In this article, we introduce a new approach to pattern generation in LAD based on Answer Set Programming (ASP), which can be applied to all common LAD pattern types. PubDate: 2025-04-16
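As background, a hedged sketch of the combinatorial object being generated: in LAD, a positive pattern is a conjunction of literals covering at least one positive observation and no negative one. The ASP encoding from the paper is not reproduced here; this is a plain Python check of the definition on made-up data.

```python
def covers(term, point):
    """A term is a set of fixed literals {index: value}; it covers a 0/1 point
    if the point agrees with every fixed literal."""
    return all(point[i] == v for i, v in term.items())

def is_positive_pattern(term, positives, negatives):
    """A positive pattern covers at least one positive observation
    and no negative observation of the partially defined Boolean function."""
    return (any(covers(term, p) for p in positives)
            and not any(covers(term, n) for n in negatives))

positives = [(1, 0, 1), (1, 1, 1)]
negatives = [(0, 0, 1), (0, 1, 0)]
print(is_positive_pattern({0: 1, 2: 1}, positives, negatives))  # True
```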
Abstract: We present an approach for automated theorem proving in 3D solid geometry utilizing multiple algebraic prover engines. Our solution integrates dynamic geometry systems capable of generating solid geometry constructions. We have employed two different methods to transform geometric statements into algebraic representations, while implementing and evaluating these approaches with various theorem provers. Furthermore, we have explored non-degeneracy conditions (NDG) in the context of 3D solid geometry and provided insights into their role in ensuring the validity of geometric relations. PubDate: 2025-04-01
Abstract: Amid the global COVID-19 pandemic, developing effective drugs to combat Coronavirus Disease (COVID-19) has become crucial. However, identifying potential side effects of these drugs remains a significant challenge in the pursuit of effective treatments. Recent advancements in computational models for pharmaceutical development have opened new possibilities for detecting such side effects. In response to the urgent need for effective COVID-19 drugs, this research introduces a novel methodology combining multi-label InceptionV3 and Bidirectional Long Short-Term Memory (IncV3-BLSTM). The experimental evaluations utilize chemical conformers derived from the stick structures of COVID-19 drugs. These conformers' distinctive features are represented through RGB color channels, with feature extraction performed using InceptionV3, GlobalAveragePooling2D, and BLSTM layers. The results demonstrate the superior efficiency of the IncV3-BLSTM model, outperforming previous studies. Notably, the proposed model achieves a peak accuracy of 97.50% and a detection rate of 82.16% for co-occurring potential side effects. This research marks a significant advancement in modeling drug side effects, particularly for COVID-19 treatments. PubDate: 2025-03-31
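One plausible wiring of the layers named in the abstract (InceptionV3, GlobalAveragePooling2D, and a Bidirectional LSTM over the spatial feature grid) is sketched below in Keras. The label count, hyperparameters, and exact topology are placeholders, not the authors' published architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_SIDE_EFFECTS = 27          # placeholder label count, not from the paper

# Random weights to avoid a download; use weights="imagenet" in practice.
base = tf.keras.applications.InceptionV3(include_top=False, weights=None,
                                         input_shape=(299, 299, 3))
feat = base.output                                   # (batch, 8, 8, 2048)
gap = layers.GlobalAveragePooling2D()(feat)          # global summary vector
seq = layers.Reshape((64, 2048))(feat)               # 8x8 grid as a 64-step sequence
blstm = layers.Bidirectional(layers.LSTM(128))(seq)  # bidirectional context
merged = layers.Concatenate()([gap, blstm])
out = layers.Dense(NUM_SIDE_EFFECTS, activation="sigmoid")(merged)  # multi-label

model = Model(base.input, out)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.BinaryAccuracy()])
model.summary()
```

The sigmoid output with binary cross-entropy is the standard choice for multi-label prediction, since each side effect is an independent yes/no decision.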
Abstract: For the rational parameterization of curves, it is desirable to make the angular speed as uniform as possible. When a rational parameterization of a curve is given, its angular speed may be made more uniform by finding a suitable reparameterization, where one natural approach is to use piecewise rational reparameterization. However, this approach does not improve the situation when the angular speed of the original rational parameterization is zero at certain points on the curve. In this paper, we demonstrate that this challenge may be tackled by utilizing piecewise radical reparameterization. PubDate: 2025-03-27
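A classical illustration of the non-uniformity problem: for the standard rational parameterization of the unit circle, the angular speed works out to 2/(1+t^2), far from constant. A SymPy check (illustrative only; the paper's radical reparameterization is not reproduced):

```python
import sympy as sp

t = sp.symbols("t", real=True)
# Standard rational parameterization of the unit circle.
x = (1 - t**2) / (1 + t**2)
y = 2 * t / (1 + t**2)

theta = sp.atan2(y, x)                  # angle of the point on the circle
omega = sp.simplify(sp.diff(theta, t))  # angular speed d(theta)/dt
print(omega)                            # simplifies to 2/(t**2 + 1)
```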
Abstract: Subspace clustering, particularly multi-view subspace clustering, has become increasingly relevant in machine learning due to the proliferation of multi-view data sets. Despite significant advancements, existing multi-view subspace clustering algorithms still encounter two primary limitations. Firstly, most methods learn the affinity matrix using constraints or regularization terms and then apply spectral clustering. This approach is susceptible to noise and redundant information in the original data. Secondly, in practical applications, multi-view data is often stored across different devices, some of which may contain private information that cannot be shared. Although traditional federated learning can address this issue, it faces limitations in scenarios where the central server is either absent or has failed. To resolve these problems, we propose a Decentralized Federated Multi-view sparse Subspace Clustering (DFMSC) method. DFMSC introduces a decentralized approach that avoids the need for a central server, reducing the vulnerability to server failure and enhancing data privacy. Specifically, our approach integrates self-representation learning, graph structure updating, and spectral embedding learning within a decentralized framework. We enforce consistency across different views by introducing a consistency constraint, which ensures that updates are made locally while achieving a unified spectral embedding through neighbor communication. Accordingly, we propose an iterative algorithm to solve the resulting optimization problem. Experimental results on a variety of real-world multi-view datasets demonstrate the superiority of our approach. PubDate: 2025-03-25
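For orientation, a sketch of the classical centralized pipeline that methods like DFMSC build on: sparse self-representation, a symmetrized affinity matrix, then spectral clustering. This is plain sparse subspace clustering on a toy single-view dataset, not the decentralized federated algorithm itself.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def sparse_subspace_clustering(X, n_clusters, alpha=0.01):
    """Classical SSC: express each sample as a sparse combination of the
    others (X ~ X @ C with diag(C) = 0), then spectrally cluster |C| + |C|^T."""
    n = X.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        mask = np.arange(n) != i
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(X[mask].T, X[i])          # columns are the other samples
        C[i, mask] = lasso.coef_
    W = np.abs(C) + np.abs(C).T             # symmetric affinity matrix
    sc = SpectralClustering(n_clusters=n_clusters, affinity="precomputed")
    return sc.fit_predict(W)

# Two 1-D subspaces (lines) in R^3, slightly perturbed.
rng = np.random.default_rng(0)
u, v = np.array([1.0, 0, 0]), np.array([0, 1.0, 1.0])
X = np.vstack([np.outer(rng.normal(size=20), u),
               np.outer(rng.normal(size=20), v)]) + 0.01 * rng.normal(size=(40, 3))
print(sparse_subspace_clustering(X, 2))
```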
Abstract: How can we efficiently determine meta-parameter values for deep learning-based time-series forecasting given a time-series dataset? This paper introduces Xtune, an efficient and novel meta-parameter tuning method for deep learning-based time-series forecasting, leveraging explainable AI techniques. In particular, this study focuses on optimizing the window size for time-series forecasting. Xtune determines the optimal meta-parameter value for these methods and can also be applied to tune the window size for anomaly detection methods that utilize deep learning-based time-series forecasting. Extensive experiments on real-world datasets and forecasting methods demonstrate that Xtune efficiently identifies the optimal meta-parameter value and consistently outperforms the existing methods in terms of execution speed. PubDate: 2025-03-11
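The meta-parameter in question is the forecasting window size. A minimal sketch of why it matters, comparing window sizes by validation error with a naive least-squares forecaster; this brute-force loop is exactly the kind of search Xtune aims to avoid, and all details are illustrative.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def window_mse(series, w):
    """Validation MSE of a naive linear (least-squares) forecaster that
    predicts the next value from the previous w values."""
    X = sliding_window_view(series[:-1], w)      # inputs: w-length windows
    y = series[w:]                               # targets: the next value
    split = len(y) * 3 // 4
    coef, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
    return float(np.mean((X[split:] @ coef - y[split:]) ** 2))

t = np.arange(500)
series = np.sin(2 * np.pi * t / 24) + 0.1 * np.random.default_rng(0).normal(size=500)
for w in (4, 8, 16, 24, 48):
    print(w, round(window_mse(series, w), 4))
```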
Abstract: Although there are several systems that successfully generate construction steps for ruler-and-compass construction problems, none of them provides readable synthetic correctness proofs for the generated constructions. In this paper, we demonstrate how our triangle construction solver ArgoTriCS can cooperate with automated theorem provers for first-order logic and coherent logic so that it generates construction correctness proofs that are both human-readable and formal (they can be checked by interactive theorem provers such as Isabelle/HOL or Coq). For this purpose, we identified a set of relevant lemmas and developed a coherent logic prover GCProver customized for geometry construction problems. Our experiments show that the results are much better than with general-purpose theorem provers. PubDate: 2025-02-20
Abstract: In this paper, we consider automated solving of triangle straightedge-and-compass construction problems by reducing them to automated planning. We consider the problems from Wernick's list, where each problem assumes that the locations of three significant points of a triangle are given, and the goal is to construct all three vertices of the triangle. We develop two different models of the corresponding planning problem. In the first model, the planning problem is described using the PDDL language, which is suitable for solving with dedicated automated planners. The second model assumes that the planning problem is first expressed as a finite-domain constraint satisfaction problem and then encoded in the MiniZinc language. Such a model is suitable for solving with constraint solvers. In both cases, we employ existing artificial intelligence tools in the search for a solution to our construction problem. The main benefit of using existing tools for this purpose, instead of developing dedicated ones, is that we can rely on the efficient search already implemented within the tool, enabling us to focus on the geometric aspects of the problem. We evaluate our approach on 74 solvable problems from Wernick's list and compare it to the dedicated triangle construction solver ArgoTriCS. The results show that our approach tends to be superior to dedicated tools in terms of efficiency, while requiring much less implementation effort. Also, we are often able to find shorter construction plans, thanks to the optimization capabilities offered by modern planners and constraint solvers. The presented approach is only a search method and does not address proving the correctness of the obtained constructions or discussing when solutions exist, leaving these tasks to other tools. Although the paper focuses on a specific set of construction problems, the approach can be generalized to other classes of problems, which will be explored in future work. PubDate: 2025-02-19
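To make the reduction concrete, here is a toy forward-search planner in the spirit described above: states are sets of constructed objects, and each action has preconditions and effects. The instance is hypothetical and heavily simplified; the paper uses PDDL planners and MiniZinc constraint solvers instead of hand-rolled search.

```python
from collections import deque

# A hypothetical, heavily simplified construction-planning instance:
# each action maps a set of required objects to a set of new objects.
ACTIONS = {
    "line(A,B)":          ({"A", "B"}, {"lineAB"}),
    "midpoint(A,B)":      ({"A", "B"}, {"M_AB"}),
    "perp_bisector(A,B)": ({"A", "B"}, {"bisAB"}),
    "intersect":          ({"lineAB", "bisAB"}, {"M_AB"}),
}

def plan(initial, goal):
    """Breadth-first search for a shortest action sequence reaching the goal."""
    queue, seen = deque([(frozenset(initial), [])]), set()
    while queue:
        state, path = queue.popleft()
        if goal <= state:
            return path
        for name, (pre, eff) in ACTIONS.items():
            if pre <= state:
                nxt = state | eff
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [name]))
    return None

print(plan({"A", "B"}, {"M_AB"}))  # shortest plan: ['midpoint(A,B)']
```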