Authors: Francisco Botana; Tomas Recio
Pages: 3-20
Abstract: We review the behavior of some popular dynamic geometry software when computing envelopes, relating the diverse methods implemented in these programs to the various definitions of envelope. Special attention is given to the new GeoGebra 5.0 version, which incorporates a mathematically rigorous approach to envelope computations. Furthermore, the role, in this context, of the cooperation between GeoGebra and a recent parametric polynomial solving algorithm is discussed in detail. This approach seems to yield accurate results, allowing for the first time sound computations of envelopes of families of plane curves in interactive environments.
PubDate: 2017-05-01
DOI: 10.1007/s10472-016-9500-3
Issue No: Vol. 80, No. 1 (2017)

Authors: Uwe Egly; Martin Kronegger; Florian Lonsing; Andreas Pfandler
Pages: 21-45
Abstract: We consider planning with uncertainty in the initial state as a case study of incremental quantified Boolean formula (QBF) solving. We report on experiments with a workflow that incrementally encodes a planning instance into a sequence of QBFs. To solve this sequence of successively constructed QBFs, we use our general-purpose incremental QBF solver DepQBF. Since the generated QBFs have many clauses and variables in common, our approach avoids redundancy in both the encoding phase and the solving phase. We also present experiments with incremental preprocessing techniques based on blocked clause elimination (QBCE). QBCE allows certain clauses to be eliminated from a QBF in a satisfiability-preserving way. We implemented the QBCE-based techniques in DepQBF in three variants: as preprocessing, as inprocessing (which extends preprocessing by taking into account variable assignments fixed by the QBF solver), and as a novel dynamic approach in which QBCE is tightly integrated into the solving process. For DepQBF, experimental results show that incremental QBF solving with incremental QBCE outperforms incremental QBF solving without QBCE, which in turn outperforms nonincremental QBF solving. For the first time we report on incremental QBF solving with incremental QBCE as inprocessing. Our results are the first empirical study of incremental QBF solving in the context of planning and motivate its use in other application domains.
PubDate: 2017-05-01
DOI: 10.1007/s10472-016-9501-2
Issue No: Vol. 80, No. 1 (2017)
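The blocked-clause idea behind QBCE can be illustrated on plain propositional CNF. The sketch below is a simplification under stated assumptions: real QBCE additionally restricts blocking literals to existential variables and uses outer resolvents, and DepQBF's implementation is far more sophisticated. Clauses are sets of signed integers (DIMACS-style literals); `blocked_clause_elimination` is a hypothetical helper name, not part of DepQBF's API.

```python
def is_tautology(clause):
    # A clause (a set of integer literals) is a tautology
    # if it contains both l and -l for some literal l.
    return any(-lit in clause for lit in clause)

def is_blocked(clause, lit, others):
    # `clause` is blocked by `lit` if every resolvent on `lit`
    # with a clause containing -lit is a tautology.
    return all(
        is_tautology((clause - {lit}) | (other - {-lit}))
        for other in others if -lit in other
    )

def blocked_clause_elimination(cnf):
    # Repeatedly remove clauses blocked by one of their literals;
    # each removal preserves satisfiability of the propositional CNF.
    cnf = [frozenset(c) for c in cnf]
    changed = True
    while changed:
        changed = False
        for c in list(cnf):
            others = [d for d in cnf if d is not c]
            if any(is_blocked(c, lit, others) for lit in c):
                cnf.remove(c)
                changed = True
    return cnf
```

For example, in `[{1, 2}, {-1, -2}]` the clause `{1, 2}` is blocked by literal `1` because the only resolvent, `{2, -2}`, is tautological; after its removal `{-1, -2}` is vacuously blocked, so the whole formula is eliminated.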

Authors: Juana Sendra; David Gómez Sánchez-Pascuala; Valerio Morán
Pages: 47-64
Abstract: In this paper we present two packages, implemented in the computer algebra system Maple, for dealing with offsets and conchoids to algebraic curves, respectively. Help pages and procedures are described. In an annex, we also provide a brief atlas, created with these packages, in which the offsets and conchoids of several algebraic plane curves are obtained, their rationality is analyzed, and parametrizations are computed. Practical performance shows that the packages execute in reasonable time; we include tables of the time cost of computing the offset and conchoid curves of two rational families of curves with the implemented packages.
PubDate: 2017-05-01
DOI: 10.1007/s10472-016-9504-z
Issue No: Vol. 80, No. 1 (2017)
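The packages themselves are Maple code, but the defining construction of an offset curve is easy to sketch numerically: each point of a parametric curve is shifted by a fixed distance along the unit normal. The Python fragment below is a hypothetical illustration unrelated to the Maple packages; all names are assumptions.

```python
import math

def offset_points(x, y, dx, dy, d, ts):
    # Sample the offset at distance d of the parametric curve (x(t), y(t)).
    # Each point is shifted along the unit normal (dy, -dx) / |(dx, dy)|,
    # where dx, dy are the derivatives of x, y with respect to t.
    pts = []
    for t in ts:
        vx, vy = dx(t), dy(t)
        n = math.hypot(vx, vy)
        pts.append((x(t) + d * vy / n, y(t) - d * vx / n))
    return pts
```

For the unit circle (`x = cos t`, `y = sin t`) the offset at distance 0.5 is, as expected, the concentric circle of radius 1.5.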

Authors: M. Martinez; A. M. H. Abdel-Fattah; U. Krumnack; D. Gómez-Ramírez; A. Smaill; T. R. Besold; A. Pease; M. Schmidt; M. Guhe; K.-U. Kühnberger
Pages: 65-89
Abstract: In cognitive science, conceptual blending has been proposed as an important cognitive mechanism that facilitates the creation of new concepts and ideas by constrained combination of available knowledge. It thereby provides a possible theoretical foundation for modeling high-level cognitive faculties such as the ability to understand, learn, and create new concepts and theories. Quite often the development of new mathematical theories and results is based on the combination of previously independent concepts, potentially even originating from distinct subareas of mathematics. Conceptual blending promises to offer a framework for modeling and re-creating this form of mathematical concept invention with computational means. This paper describes a logic-based framework that allows a formal treatment of theory blending (a subform of the general notion of conceptual blending with high relevance for applications in mathematics), discusses an interactive algorithm for blending within the framework, and provides several illustrative worked examples from mathematics.
PubDate: 2017-05-01
DOI: 10.1007/s10472-016-9505-y
Issue No: Vol. 80, No. 1 (2017)

Authors: Belaïd Benhamou
Pages: 91-112
Abstract: The concept of symmetry has been extensively studied in constraint programming and propositional satisfiability. Several methods for detecting and removing these symmetries have been developed, and their use in well-known solvers in these domains has dramatically improved effectiveness on a wide variety of problems considered difficult to solve. The concept of symmetry can be exported to other areas where structure can be exploited effectively, particularly data mining, where some tasks can be expressed as constraints or logical formulas. Here we are interested in the detection and elimination of local and global symmetries in the itemset mining problem. Recent work has provided effective Boolean-constraint encodings for these data mining tasks, and ideas on symmetry elimination in this area have begun to appear, but they are still few, and the techniques presented usually address global symmetry that is detected and eliminated statically in a preprocessing phase. In this work we study the notion of local symmetry and compare it to global symmetry for the itemset mining problem. We show how local symmetries of the Boolean encoding can be detected dynamically, and we give properties that allow these symmetries to be eliminated in SAT-based itemset mining solvers in order to enhance their efficiency.
PubDate: 2017-05-01
DOI: 10.1007/s10472-016-9528-4
Issue No: Vol. 80, No. 1 (2017)

Authors: Mikko Lauri; Aino Ropponen; Risto Ritala
Pages: 337-370
Abstract: We consider the problem of an agent traversing a directed graph with the objective of maximizing the probability of reaching a goal node before a given deadline. Only the probability distributions of the edge travel times are known to the agent. The agent must balance traversal actions towards the goal against delays due to actions that improve information about edge travel times. We describe the relationship of the problem to the more general partially observable Markov decision process. Further, we show that if edge travel times are independent and the underlying directed graph is acyclic, a closed-loop solution can be computed. The solution specifies whether to execute a traversal or an information-gathering action as a function of the current node, the time remaining until the deadline, and the information about edge travel times. We present results from two case studies, quantifying the usefulness of information-gathering as opposed to applying only traversal actions.
PubDate: 2017-04-01
DOI: 10.1007/s10472-016-9527-5
Issue No: Vol. 79, No. 4 (2017)
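For the special case the authors identify as tractable (independent edge travel times on an acyclic graph), the traversal-only part of the problem can be sketched as a dynamic program over (node, remaining time) pairs. The sketch below is an illustration under those assumptions, with discrete travel-time distributions and without the information-gathering actions that are the paper's focus; all names are hypothetical.

```python
def max_reach_prob(graph, goal, node, budget, memo=None):
    # graph[u] = list of (v, dist), where dist maps an integer travel
    # time to its probability. Returns the maximum probability of
    # reaching `goal` from `node` within `budget` time units, choosing
    # the best outgoing edge at every node (acyclic graph assumed).
    if memo is None:
        memo = {}
    if node == goal:
        return 1.0
    if budget <= 0 or not graph.get(node):
        return 0.0
    key = (node, budget)
    if key not in memo:
        memo[key] = max(
            sum(p * max_reach_prob(graph, goal, v, budget - t, memo)
                for t, p in dist.items() if t <= budget)
            for v, dist in graph[node]
        )
    return memo[key]
```

For example, with a direct edge to the goal of fixed length 4 and a detour whose first leg takes 1 or 3 time units with equal probability, the deadline determines which route is optimal.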

Authors: Piotr Wojciechowski; Pavlos Eirinakis; K. Subramani
Pages: 371-392
Abstract: Erratum to: Ann Math Artif Intell (2017) 79:245-266, DOI 10.1007/s10472-016-9525-7. Owing to an error in the production process, the article was originally published incorrectly online. The corrected article, including the illustrations, is presented hereafter.
PubDate: 2017-04-01
Issue No: Vol. 79, No. 4 (2017)

Authors: S. Zhou; E. N. Smirnov; G. Schoenmakers; R. Peeters
Abstract: Instance transfer for classification aims at boosting the generalization performance of classification models for a target domain by exploiting data from a relevant source domain. Most instance-transfer approaches assume that the source data are relevant to the target data for the complete set of features used to represent the data. This assumption fails if the target data and source data are relevant only for strict subsets of the input features, a situation we call “partially input-feature relevant”. In this case these approaches may result in sub-optimal classification models or even in negative transfer. This paper proposes a new decision-tree approach to instance transfer when the source data are partially input-feature relevant to the target data. The approach selects input features for tree nodes using univariate transfer of source instances. The instance transfer is guided by a conformal test for source-relevance estimation. Experimental results on real-world data sets demonstrate that the new decision-tree approach is capable of outperforming existing instance-transfer approaches, especially when the source data are partially input-feature relevant to the target data.
PubDate: 2017-06-17
DOI: 10.1007/s10472-017-9554-x

Authors: Paolo Toccaceli; Ilia Nouretdinov; Alexander Gammerman
Abstract: The paper presents an application of conformal predictors to a chemoinformatics problem: predicting the biological activities of chemical compounds. The paper addresses some specific challenges in this domain: a large number of compounds (training examples), high dimensionality of the feature space, sparseness, and a strong class imbalance. A variant of conformal predictors called the Inductive Mondrian Conformal Predictor is applied to deal with these challenges. Results are presented for several non-conformity measures extracted from underlying algorithms and different kernels. A number of performance measures are used in order to demonstrate the flexibility of Inductive Mondrian Conformal Predictors in dealing with such a complex set of data. This approach allowed us to identify the most likely active compounds for a given biological target and present them in ranking order.
PubDate: 2017-06-16
DOI: 10.1007/s10472-017-9556-8
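The Mondrian (label-conditional) construction at the core of such a predictor can be sketched in a few lines: a test example's nonconformity score is ranked only against calibration scores of the same class, which preserves per-class validity under strong class imbalance. A minimal sketch, assuming scores where larger means more nonconforming; the function name and data layout are hypothetical, not the authors' implementation.

```python
def mondrian_p_value(cal_scores_by_label, test_score, label):
    # Label-conditional (Mondrian) p-value: the test score is ranked
    # only against calibration scores of the same label, so validity
    # holds per class even when one class is rare.
    scores = cal_scores_by_label[label]
    ge = sum(1 for s in scores if s >= test_score)
    return (ge + 1) / (len(scores) + 1)
```

A prediction set at significance level ε then contains every label whose p-value exceeds ε.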

Authors: Shufeng Kong; Sanjiang Li; Yongming Li; Zhiguo Long
Abstract: The study of tractable subclasses of constraint satisfaction problems is a central topic in constraint solving. Tree convex constraints are extensions of the well-known row convex constraints. Just like the latter, every path-consistent tree convex constraint network is globally consistent. However, it is NP-complete to decide whether a tree convex constraint network has solutions. This paper studies and compares three subclasses of tree convex constraints, called chain-, path-, and tree-preserving constraints respectively. The class of tree-preserving constraints strictly contains the subclasses of path-preserving and arc-consistent chain-preserving constraints. We prove that, when enforcing strong path-consistency on a tree-preserving constraint network, the network remains tree-preserving at each step. This ensures the global consistency of consistent tree-preserving networks after enforcing strong path-consistency, and also guarantees that the partial path-consistency algorithms, which are usually much more efficient than the path-consistency algorithms on large sparse constraint networks, are applicable to tree-preserving constraint networks. As an application, we show that the class of tree-preserving constraints is useful in solving the scene labelling problem.
PubDate: 2017-05-29
DOI: 10.1007/s10472-017-9552-z

Authors: Alexander Kuleshov; Alexander Bernstein
Abstract: Consider an unknown smooth function that maps high-dimensional inputs to multidimensional outputs and whose domain of definition is an unknown low-dimensional input manifold embedded in an ambient high-dimensional input space. Given a training dataset consisting of input-output pairs, the regression-on-input-manifold problem is to estimate the unknown function and its Jacobian matrix, as well as the input manifold itself. By transforming the high-dimensional inputs into their low-dimensional features, the initial regression problem is reduced to a certain regression problem on the feature space. The paper presents a new geometrically motivated method for solving both interrelated regression problems.
PubDate: 2017-05-16
DOI: 10.1007/s10472-017-9551-0

Authors: Ernst Ahlberg; Oscar Hammar; Claus Bendtsen; Lars Carlsson
Abstract: We present two applications of conformal prediction relevant to drug discovery. The first application concerns the interpretation of predictions; the second, the selection of compounds to progress in a drug discovery project setting.
PubDate: 2017-04-28
DOI: 10.1007/s10472-017-9550-1

Authors: Lior Aronshtam; Havazelet Cohen; Tammar Shrot
Abstract: This article focuses on whether a certain candidate's (player's) chance to advance further in a tennis tournament can be increased when the ordering of the tournament can be controlled (manipulated by the organizers) according to his own preferences. Is it possible to increase the number of ranking points a player will receive? And, most importantly, can it be done in reasonable computational time? The answers to these questions differ across settings, e.g., the information available on the outcome of each game, and whether it is the number of points gained or the number of games won that matters. We analyzed five different variations of these tournament questions. First, the computational hardness of trying to control the tournaments is shown. Then the tools of parameterized complexity are used to investigate the source of the problems' hardness; specifically, we check whether this hardness persists when the size of the problem is bounded. The findings of this analysis show that under certain circumstances it is possible to control the tournament in favor of a specific candidate in order to help him advance further in the tournament.
PubDate: 2017-04-24
DOI: 10.1007/s10472-017-9549-7
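The object being manipulated (not the paper's hardness proofs) can be made concrete: given a pairwise win-probability matrix, each player's chance of winning a single-elimination bracket follows from a simple recursion, and the organizer's control problem is choosing the bracket order that maximizes one entry. A minimal sketch assuming a balanced power-of-two draw; all names are hypothetical.

```python
def win_probs(bracket, beats):
    # Probability that each player in `bracket` (a power-of-two seeding
    # order) wins the whole single-elimination tournament, where
    # beats[a][b] is the probability that player a defeats player b.
    if len(bracket) == 1:
        return {bracket[0]: 1.0}
    mid = len(bracket) // 2
    left = win_probs(bracket[:mid], beats)
    right = win_probs(bracket[mid:], beats)
    out = {}
    # A left-half player wins iff they win their half and then beat
    # whichever player emerges from the right half (and vice versa).
    for a, pa in left.items():
        out[a] = pa * sum(pb * beats[a][b] for b, pb in right.items())
    for b, pb in right.items():
        out[b] = pb * sum(pa * beats[b][a] for a, pa in left.items())
    return out
```

An organizer would evaluate `win_probs` over candidate bracket orderings to favor a chosen player; the paper shows that doing this optimally is computationally hard in general.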

Authors: John McCabe-Dansted; Mark Reynolds
Abstract: Attempts to manage reasoning about systems with fairness properties are long-running. The popular but restricted Computational Tree Logic (CTL) is amenable to automated reasoning but has difficulty expressing some fairness properties. More expressive languages such as CTL* and CTL+ are computationally complex. The main contribution of this paper is to show the usefulness and practicality of employing the bundled variants of these languages to handle fairness. In particular, we present a tableau for a bundled variant of CTL that retains computational complexity similar to that of the CTL tableau while admitting a simpler implementation. We further show that the decision problem remains in EXPTIME even if a bounded number of CTL* fairness constraints are allowed in the input formulas. By abandoning limit closure, the bundled logics can simultaneously be easier to automate and still express many typical fairness constraints.
PubDate: 2017-04-19
DOI: 10.1007/s10472-017-9546-x

Authors: Vladimir Vovk; Dusko Pavlovic
Abstract: We construct universal prediction systems in the spirit of Popper’s falsifiability and Kolmogorov complexity and randomness. These prediction systems do not depend on any statistical assumptions (but under the IID assumption they dominate, to within the usual accuracy, conformal prediction). Our constructions give rise to a theory of algorithmic complexity and randomness of time containing analogues of several notions and results of the classical theory of Kolmogorov complexity and randomness.
PubDate: 2017-04-19
DOI: 10.1007/s10472-017-9547-9

Authors: A. Zaytsev; E. Burnaev
Abstract: Engineers widely use the Gaussian process regression framework to construct surrogate models intended to replace computationally expensive physical models while exploring a design space. Thanks to Gaussian process properties, we can use both samples generated by a high-fidelity function (an expensive and accurate representation of a physical phenomenon) and a low-fidelity function (a cheap and coarse approximation of the same physical phenomenon) while constructing a surrogate model. However, if sample sizes exceed a few thousand points, the computational costs of Gaussian process regression become prohibitive both for learning and for prediction. We propose two approaches to circumvent this computational burden: one based on the Nyström approximation of sample covariance matrices, and another based on intelligent use of a blackbox that can evaluate the low-fidelity function on the fly at any point of the design space. We examine the performance of the proposed approaches on a number of artificial and real problems, including the engineering optimization of a rotating disk shape.
PubDate: 2017-04-05
DOI: 10.1007/s10472-017-9545-y
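The first of the two proposed approaches rests on the Nyström approximation, whose core can be sketched independently of the paper: a low-rank feature map built from a small set of inducing points, whose inner products approximate the full kernel matrix. A minimal NumPy sketch, not the authors' implementation; all names are hypothetical.

```python
import numpy as np

def nystroem_features(X, inducing, kernel):
    # Nyström feature map: phi(x) = K_mm^{-1/2} k_m(x), so that
    # phi(x) . phi(y) approximates kernel(x, y) using only m
    # inducing points instead of the full n x n kernel matrix.
    K_mm = np.array([[kernel(u, v) for v in inducing] for u in inducing])
    # Symmetric inverse square root via eigendecomposition,
    # clamping tiny eigenvalues for numerical stability.
    w, V = np.linalg.eigh(K_mm)
    w = np.maximum(w, 1e-12)
    K_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    K_nm = np.array([[kernel(x, u) for u in inducing] for x in X])
    return K_nm @ K_inv_sqrt
```

When the inducing set equals the data, the approximation is exact: the Gram matrix of the features reproduces the kernel matrix.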

Authors: Weifu Ding; Jiangshe Zhang
Abstract: This paper presents a novel online object tracking algorithm with sparse representation for learning effective appearance models under a particle filtering framework. Compared with the state-of-the-art ℓ1 sparse tracker, which simply assumes that the image pixels are corrupted by independent Gaussian noise, our proposed method is based on information-theoretic learning and is much less sensitive to corruptions; it achieves this by assigning small weights to occluded pixels and outliers. The most appealing aspect of this approach is that it can yield robust estimations without using the trivial templates adopted by the previous sparse tracker. By using weighted linear least squares with non-negativity constraints at each iteration, a sparse representation of the target candidate is learned; to further improve the tracking performance, target templates are dynamically updated to capture appearance changes. In our template update mechanism, the similarity between the templates and the target candidates is measured by the earth mover's distance (EMD). Using the largest open benchmark for visual tracking, we empirically compare two ensemble methods constructed from six state-of-the-art trackers against the individual trackers. The proposed tracking algorithm runs in real time and, on challenging sequences, performs favorably against state-of-the-art algorithms in terms of efficiency, accuracy, and robustness.
PubDate: 2017-03-23
DOI: 10.1007/s10472-017-9543-0

Authors: Federico Bergenti; Stefania Monica
Abstract: This paper describes an algorithm to enforce hyper-arc consistency of polynomial constraints defined over finite domains. First, the paper describes the language of so-called polynomial constraints over finite domains and introduces a canonical form for such constraints. Then, the canonical form is used to transform the problem of testing the satisfiability of a constraint in a box into the problem of studying the sign of a related polynomial function in the same box, a problem which is effectively solved using the modified Bernstein form of polynomials. The modified Bernstein form of polynomials is briefly discussed, and the proposed hyper-arc consistency algorithm is then detailed. The proposed algorithm is a subdivision procedure which, starting from an initial approximation of the domains of the variables, removes values from the domains to enforce hyper-arc consistency.
PubDate: 2017-03-20
DOI: 10.1007/s10472-017-9544-z
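The sign-certificate step that such subdivision procedures rest on can be sketched for univariate polynomials on [0, 1]: if all Bernstein coefficients share a strict sign, the polynomial has that sign on the whole box, while mixed signs trigger subdivision. The sketch below is an illustration under that univariate assumption, not the paper's multivariate modified-Bernstein-form algorithm; all names are hypothetical.

```python
from math import comb

def bernstein_coeffs(a):
    # Bernstein coefficients on [0, 1] of p(x) = sum(a[j] * x**j):
    # b_i = sum_{j <= i} (C(i, j) / C(n, j)) * a[j], with n the degree.
    # Note b_0 = p(0) and b_n = p(1).
    n = len(a) - 1
    return [sum(comb(i, j) / comb(n, j) * a[j] for j in range(i + 1))
            for i in range(n + 1)]

def certified_sign(a):
    # If all Bernstein coefficients are strictly positive (negative),
    # the polynomial is positive (negative) on all of [0, 1].
    # Mixed signs return 0 ("unknown"); a full implementation would
    # then subdivide the box and recurse on the halves.
    b = bernstein_coeffs(a)
    if all(c > 0 for c in b):
        return 1
    if all(c < 0 for c in b):
        return -1
    return 0
```

For example, `x**2 + 1` (coefficients `[1, 0, 1]`) is certified positive, while `2*x - 1` (coefficients `[-1, 2]`) changes sign on [0, 1] and would require subdivision.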