Abstract: Publication date: Available online 23 September 2016 Source:Journal of Applied Logic Author(s): John N. Martin This paper addresses the degree to which The Port Royal Logic anticipates Boolean Algebra. According to Marc Dominicy the best reconstruction is a Boolean Algebra of Carnapian properties, functions from possible worlds to extensions. Sylvain Auroux's reconstruction approximates a non-complemented bounded lattice. This paper argues that it is anachronistic to read lattice algebra into the Port Royal Logic. It is true that the Logic treats extensions like sets, orders ideas under a containment relation, and posits mental operations of abstraction and restriction. It also orders species in a version of the tree of Porphyry, and allows that genera may be divided into species by privative negation. There is, however, no maximal or minimal idea. Restriction is not binary. Neither abstraction nor restriction is closed. Ideas under containment, therefore, do not form a lattice. Nor are the relevant formal properties of lattices discussed. Term negation is privative, not a complementation operation. The technical ideas relevant to the discussion are defined. The Logic's purpose in describing the structure was not to develop algebra in the modern sense but rather to provide a new basis for the semantics of mental language consistent with Cartesian metaphysics. The account was not algebraic, but metaphysical and psychological, based on the concept of comprehension, a Cartesian version of medieval objective being.

Abstract: Publication date: Available online 5 September 2016 Source:Journal of Applied Logic Author(s): Aritra Hazra, Pallab Dasgupta, Partha Pratim Chakrabarti Reliability has become an integral component of the design intent of embedded cyber-physical systems. Safety-critical embedded systems are designed with specific reliability targets, and design practices include the appropriate allocation of both spatial and temporal redundancies in the implementation to meet such requirements. With the increasing complexity of such systems, and considering the large number of components they contain, redundancy allocation requires a formal scientific basis. In this work, we propose analyzing the redundancy requirement upfront, with the objective of making it an integral part of the specification. The underlying problem is one of synthesizing a formal specification with built-in redundancy artifacts from the formal properties of the error-free system, the error probabilities of the control components, and the reliability target. We believe that upfront formal analysis of redundancy requirements is important in budgeting resource requirements from a cost-versus-reliability perspective. Several case studies from the automotive domain highlight the efficacy of our proposal.

Abstract: Publication date: Available online 3 August 2016 Source:Journal of Applied Logic Author(s): Justine Jacot, Emmanuel Genot, Frank Zenker This article demonstrates that typical restrictions which are imposed in dialogical logic in order to recover first-order logical consequence from a fragment of natural language argumentation are also forthcoming from preference profiles of boundedly rational players, provided that these players instantiate a specific player type and compute partial strategies. We present two structural rules, which are formulated similarly to closure rules for tableaux proofs that restrict players' strategies to a mapping between games in extensive forms (i.e., game trees) and proof trees. Both rules are motivated from players' preferences and limitations; they can therefore be viewed as being player-self-imposable. First-order logical consequence is thus shown to result from playing a specific type of argumentation game. The alignment of such games with the normative model of the Pragma-dialectical theory of argumentation is positively evaluated. But explicit rules to guarantee that the argumentation game instantiates first-order logical consequence have now become gratuitous, since their normative content arises directly from players' preferences and limitations. A similar naturalization for non-classical logics is discussed.

Abstract: Publication date: September 2016 Source:Journal of Applied Logic, Volume 17 Author(s): Nayat Sanchez-Pi, Luis Martí, Ana Cristina Bicharra Garcia Information retrieval has been widely studied due to the growing amounts of textual information available electronically. Nowadays, organizations and industries face the challenge of organizing, analyzing and extracting knowledge from masses of unstructured information for the decision-making process. The development of automatic methods to produce usable structured information from unstructured text sources is extremely valuable to them. As opposed to traditional text classification methods, which need a well-classified training corpus to perform efficient classification, an ontology-based classifier benefits from domain knowledge and provides more accuracy. In a previous work we proposed and evaluated an ontology-based heuristic algorithm [28] for the occupational health control process, particularly for the case of automatic detection of accidents from unstructured texts. Our extended proposal is more domain dependent because it uses technical terms and contrasts the relevance of these technical terms within the text, so the heuristic is more accurate. It divides the problem into subtasks: (i) text analysis, (ii) recognition, and (iii) classification of failed occupational health control, resolving accidents.

Abstract: Publication date: September 2016 Source:Journal of Applied Logic, Volume 17 Author(s): José Luis Casteleiro-Roca, Héctor Quintián, José Luis Calvo-Rolle, Emilio Corchado, María del Carmen Meizoso-López, Andrés Piñón-Pazos The heat pump with geothermal exchanger is one of the best methods to heat a building. The heat exchanger is an element with a high probability of failure, due to the fact that it is an outdoor construction and also due to its size. In the present study, a novel intelligent system was designed to detect faults in this type of heating equipment. The novel approach has been successfully tested empirically on a real dataset obtained from one year of measurements. It is based on classification techniques with the aim of detecting failures in real time. The model was then validated and verified on the building, obtaining good results over the whole range of operating conditions.

Abstract: Publication date: September 2016 Source:Journal of Applied Logic, Volume 17 Author(s): J. del Sagrado, J.A. Sánchez, F. Rodríguez, M. Berenguel Greenhouse crop production is directly influenced by climate conditions. A Bayesian network is introduced in this paper aimed at achieving adequate inside climate conditions (mainly temperature and humidity) by acting on actuators based on the value of different state variables and disturbances acting on the system. The system is built and tested using data gathered from a real greenhouse under closed-loop control (where several controllers, such as gain-scheduling ones, are used), but where growers can also perform control actions independently of the automatic control system. The Bayesian network has been shown to provide a good approximation of a control signal based on previous manual and automatic control actions implemented in the same system (based on predefined setpoints), as well as on the environmental conditions. The results thus show the performance and applicability of Bayesian networks within a climate control framework.
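As a minimal sketch of the idea behind such a network, with hypothetical variables (`cpt`, `most_probable_action`, the temperature bands) and illustrative probabilities rather than the paper's learned network: a discretized evidence variable selects the most probable control action from a conditional probability table.

```python
# Hypothetical CPT: P(vent | temp_band); the numbers are illustrative only.
cpt = {
    'low':  {'open': 0.1, 'closed': 0.9},
    'high': {'open': 0.8, 'closed': 0.2},
}

def most_probable_action(temp_band):
    # Pick the action with maximal conditional probability given the
    # discretized evidence, as a stand-in for full network inference.
    dist = cpt[temp_band]
    return max(dist, key=dist.get)

print(most_probable_action('high'))  # open
```

In the real system the table would be learned from the logged manual and automatic control actions, and inference would condition on several state variables and disturbances at once.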

Abstract: Publication date: September 2016 Source:Journal of Applied Logic, Volume 17 Author(s): Joanna Golińska-Pilarek, Emilio Muñoz-Velasco, Angel Mora We present a new prefixed tableau system TK for verification of validity in the modal logic K. The system TK is deterministic: it generates exactly one proof tree for each clausal representation of a formula and, moreover, it uses syntactic reductions of prefixes. TK is defined within the original methodology of tableau systems, without any external technique such as backtracking or backjumping. Since all the necessary bookkeeping is built into the rules, the system is not only a basis for a validity algorithm, but is itself a decision procedure. We also present a deterministic tableau decision procedure which extends TK and can be used for the global assumptions problem.

Abstract: Publication date: September 2016 Source:Journal of Applied Logic, Volume 17 Author(s): Álvaro Herrero, Bruno Baruque, Fanny Klett, Ajith Abraham, Václav Snášel, André C.P.L.F. de Carvalho, Pablo García Bringas, Ivan Zelinka, Héctor Quintián, Juan Manuel Corchado, Emilio Corchado

Abstract: Publication date: September 2016 Source:Journal of Applied Logic, Volume 17 Author(s): Pedro Luis Galdámez, Angélica González Arrieta, Miguel Ramón Ramón The purpose of this document is to offer a combined approach in the field of biometric analysis, integrating some of the best-known techniques for recognizing people by their ears. This study uses the Hausdorff distance as a pre-processing stage, adding robustness and improving performance by filtering the subjects used in the testing process. It also includes the Image Ray Transform (IRT) and a Haar-based classifier for the detection step. The system then computes Speeded Up Robust Features (SURF) and Linear Discriminant Analysis (LDA) as inputs to two neural networks that recognize a person by the patterns of his or her ear. To show experimental results for the applied theory, the above algorithms have been implemented in Microsoft C#. The results showed robustness, improving the ear recognition process.
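The Hausdorff-distance pre-filtering stage can be sketched as follows; this is a minimal illustration over 2-D point sets, not the paper's C# implementation:

```python
from math import dist

def directed_hausdorff(A, B):
    # Worst case over points of A of the distance to the nearest point of B.
    return max(min(dist(a, b) for b in B) for a in A)

def hausdorff(A, B):
    # Symmetric Hausdorff distance between two finite point sets.
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

A = [(0, 0), (1, 0)]
B = [(0, 0), (4, 0)]
print(hausdorff(A, B))  # 3.0: the point (4, 0) lies 3 away from all of A
```

Candidate subjects whose edge point sets lie beyond a distance threshold from the probe can be discarded before the more expensive SURF/LDA stage.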

Abstract: Publication date: Available online 1 July 2016 Source:Journal of Applied Logic Author(s): A.J. Goodall, J. Nešetřil, P. Ossona de Mendez A strongly polynomial sequence of graphs (G_n) is a sequence (G_n)_{n∈ℕ} of finite graphs such that, for every graph F, the number of homomorphisms from F to G_n is a fixed polynomial function of n (depending on F). For example, (K_n) is strongly polynomial since the number of homomorphisms from F to K_n is the chromatic polynomial of F evaluated at n. In earlier work of de la Harpe and Jaeger, and more recently of Averbouch, Garijo, Godlin, Goodall, Makowsky, Nešetřil, Tittmann, Zilber and others, various examples of strongly polynomial sequences and constructions for families of such sequences have been found, leading to analogues of the chromatic polynomial for fractional colourings and acyclic colourings, to choose two interesting examples. We give a new model-theoretic method of constructing strongly polynomial sequences of graphs that uses interpretation schemes of graphs in more general relational structures. This surprisingly easy yet general method encompasses all previous constructions and produces many more. We conjecture that, under mild assumptions, all strongly polynomial sequences of graphs can be produced by the general method of quantifier-free interpretation of graphs in certain basic relational structures (essentially disjoint unions of transitive tournaments with added unary relations). We verify this conjecture for strongly polynomial sequences of graphs with uniformly bounded degree.

Abstract: Publication date: Available online 15 June 2016 Source:Journal of Applied Logic Author(s): Waqar Ahmed, Osman Hasan, Sofiène Tahar Reliability Block Diagrams (RBDs) allow us to model the failure relationships of complex systems and their sub-components and are extensively used for system reliability, availability and maintainability analyses. Traditionally, these RBD-based analyses are done using paper-and-pencil proofs or computer simulations, which cannot ascertain absolute correctness owing to their inherent inaccuracies. As a complementary approach, we propose to use the higher-order logic theorem prover HOL to conduct RBD-based analysis. For this purpose, we present a higher-order logic formalization of commonly used RBD configurations, such as series, parallel, parallel-series and series-parallel, and the formal verification of their equivalent mathematical expressions. A distinguishing feature of the proposed RBD formalization is the ability to model nested RBD configurations, which are RBDs having blocks that also represent RBD configurations. This generality allows us to formally analyze the reliability of many real-world systems. For illustration purposes, we formally analyze the reliability of a generic Virtual Data Center (VDC) in a cloud computing infrastructure exhibiting the nested series-parallel RBD configuration.
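The series, parallel, and nested series-parallel laws mentioned above can be sketched numerically; the following is a minimal illustration under the usual independence assumption, not the authors' HOL formalization:

```python
from functools import reduce

def block(r):
    # Leaf block with fixed reliability r.
    return lambda: r

def series(*blocks):
    # A series RBD works only if every block works: R = prod(R_i).
    return lambda: reduce(lambda a, b: a * b, (blk() for blk in blocks))

def parallel(*blocks):
    # A parallel RBD fails only if every block fails: R = 1 - prod(1 - R_i).
    return lambda: 1.0 - reduce(lambda a, b: a * b, (1.0 - blk() for blk in blocks))

# A nested series-parallel configuration in the spirit of the VDC example:
vdc = series(parallel(block(0.9), block(0.9)), block(0.99))
print(round(vdc(), 4))  # 0.99 * (1 - 0.1 * 0.1) = 0.9801
```

Because `series` and `parallel` accept arbitrary sub-diagrams as blocks, nesting comes for free, which is the structural property the HOL formalization verifies formally rather than numerically.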

Abstract: Publication date: Available online 16 June 2016 Source:Journal of Applied Logic Author(s): Fairouz Kamareddine, Jonathan P. Seldin, J.B. Wells There are two versions of type assignment in the λ-calculus: Church-style, in which the type of each variable is fixed, and Curry-style (also called “domain free”), in which it is not. As an example, in Church-style typing, λx:A.x is the identity function on type A, and it has type A → A but not B → B for a type B different from A. In Curry-style typing, λx.x is a general identity function with type C → C for every type C. In this paper, we will show how to interpret in a Curry-style system every Pure Type System (PTS) in the Church-style without losing any typing information. We will also prove a kind of conservative extension result for this interpretation, a result which implies that for most consistent PTSs of the Church-style, the corresponding Curry-style system is consistent. We will then show how to interpret in a system of the Church-style (a modified PTS, stronger than a PTS) every PTS-like system in the Curry style.

Abstract: Publication date: November 2016 Source:Journal of Applied Logic, Volume 18 Author(s): Shokoofeh Ghorbani In this paper, we introduce hoop twist-structures, whose members are built as special squares of an arbitrary hoop. We show how our construction relates to eN4-lattices (N4-lattices) and implicative twist-structures. We prove that hoop twist-structures form a quasi-variety and characterize the AHT-congruences of each algebra in this quasi-variety in terms of the congruences of the associated hoop.

Abstract: Publication date: July 2016 Source:Journal of Applied Logic, Volume 16 Author(s): K. Lopez-de-Ipiña, A. Bergareche, P. de la Riva, M. Faundez-Zanuy, P.M. Calvo, J. Roure, E. Sesa-Nogueras Essential tremor (ET) is the most common movement disorder in the Western world, and 50–70% of essential tremor cases are estimated to be genetic in origin [14]. This work on the selection of nonlinear biomarkers derived from drawings and handwriting is part of a wider cross-study for the diagnosis of essential tremor led by the Biodonostia Institute. These biomarkers include not only classic linear features but also non-linear ones: fractal dimension and entropy. The presence of integrated features of other conditions, such as stress, is also analyzed. In future work, these new biomarkers will be integrated with the ones obtained in the wider Biodonostia study. Note that the use of these methods provides undoubted benefits towards the development of more sustainable, low-cost, high-quality, and non-invasive technologies. These systems are easily adaptable to the user and environment, and can be very useful in real, complex environments from a social and economic point of view.

Abstract: Publication date: Available online 10 May 2016 Source:Journal of Applied Logic Author(s): Paul D. Thorn, Gerhard Schurz In previous work, we studied four well known systems of qualitative probabilistic inference, and presented data from computer simulations in an attempt to illustrate the performance of the systems. These simulations evaluated the four systems in terms of their tendency to license inference to accurate and informative conclusions, given incomplete information about a randomly selected probability distribution. In our earlier work, the procedure used in generating the unknown probability distribution (representing the true stochastic state of the world) tended to yield probability distributions with moderately high entropy levels. In the present article, we present data charting the performance of the four systems when reasoning in environments of various entropy levels. The results illustrate variations in the performance of the respective reasoning systems that derive from the entropy of the environment, and allow for a more inclusive assessment of the reliability and robustness of the four systems.
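For illustration, the entropy parameter that distinguishes these simulated environments is ordinary Shannon entropy over the unknown distribution; a minimal sketch, not the authors' simulation code:

```python
from math import log2

def entropy(p):
    # H(p) = -sum_i p_i * log2(p_i), with 0 * log(0) taken to be 0.
    return -sum(pi * log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))  # 1.0: a maximally uncertain two-outcome world
print(entropy([0.9, 0.1]))  # roughly 0.469: a lower-entropy environment
```

Environments in which most conditional probabilities sit near 0 or 1 have low entropy, and it is across such regimes that the four inference systems are compared.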

Abstract: Publication date: Available online 10 May 2016 Source:Journal of Applied Logic Author(s): Nico Potyka, Engelbert Mittermeier, David Marenke The expert system shell MECore provides a series of knowledge management operations to define probabilistic knowledge bases and to reason under uncertainty. To provide a reference work for MECore algorithmics, we bring together results from different sources that have been applied in MECore and explain their intuitive ideas. Additionally, we report on our ongoing work regarding further development of MECore's algorithms to compute optimum entropy distributions and provide some empirical results. Altogether this paper explains the intuition of important theoretical results and their practical implications, compares old and new algorithmic approaches and points out their benefits as well as possible limitations and pitfalls.

Abstract: Publication date: Available online 10 May 2016 Source:Journal of Applied Logic Author(s): Achim Kuwertz, Jürgen Beyerer Adaptive knowledge modeling is an approach for extending the abilities of the Object-Oriented World Model, a system for representing the state of an observed real-world environment, to open-world modeling. In open environments, entities unforeseen at the design-time of a world model can occur. For coping with such circumstances, adaptive knowledge modeling is tasked with adapting the underlying knowledge model according to the environment. The approach is based on quantitative measures, introduced previously, for rating the quality of knowledge models. In this contribution, adaptive knowledge modeling is extended by measures for detecting the need for model adaptation and identifying the potential starting points of necessary model change as well as by an approach for applying such change. Being an extended and more detailed version of [1], the contribution also provides background information on the architecture of the Object-Oriented World Model and on the principles of adaptive knowledge modeling, as well as examination results for the proposed methods. In addition, a more complex scenario is used to evaluate the overall approach.

Abstract: Publication date: Available online 11 May 2016 Source:Journal of Applied Logic Author(s): Lieven Decock, Igor Douven, Marta Sznajder That one's degrees of belief at any one time obey the axioms of probability theory is widely regarded as a necessary condition for static rationality. Many theorists hold that it is also a sufficient condition, but according to critics this yields too subjective an account of static rationality. However, there are currently no good proposals as to how to obtain a tenable stronger probabilistic theory of static rationality. In particular, the idea that one might achieve the desired strengthening by adding some symmetry principle to the probability axioms has appeared hard to maintain. Starting from an idea of Carnap and drawing on relatively recent work in cognitive science, this paper argues that conceptual spaces provide the tools to devise an objective probabilistic account of static rationality. Specifically, we propose a principle that derives prior degrees of belief from the geometrical structure of concepts.

Abstract: Publication date: Available online 27 April 2016 Source:Journal of Applied Logic Author(s): Gerhard Jäger, Michel Marti Starting off from the usual language of modal logic for multi-agent systems dealing with the agents' knowledge/belief and common knowledge/belief we define so-called epistemic Kripke structures for intuitionistic (common) knowledge/belief. Then we introduce corresponding deductive systems and show that they are sound and complete with respect to these semantics.

Abstract: Publication date: Available online 25 April 2016 Source:Journal of Applied Logic Author(s): Wenyan Xu Cirquent calculus is a new proof-theoretic and semantic approach introduced by G. Japaridze for the needs of his theory of computability logic (CoL). The earlier article “From formulas to cirquents in computability logic” by Japaridze generalized formulas in CoL to circuit-style structures termed cirquents. It showed that, through cirquents with what are termed clustering and ranking, one can capture, refine and generalize independence-friendly (IF) logic. Specifically, the approach allows us to account for independence from propositional connectives in the same spirit as IF logic accounts for independence from quantifiers. Japaridze's treatment of IF logic, however, was purely semantical, and no deductive system was proposed. The present paper syntactically constructs a cirquent calculus system with clustering and ranking, sound and complete w.r.t. the propositional fragment of cirquent-based semantics. Such a system captures the propositional version of what is called extended IF logic, thus being an axiomatization of a nontrivial fragment of that logic.

Abstract: Publication date: Available online 2 March 2016 Source:Journal of Applied Logic Author(s): J.A. Bergstra, C.A. Middelburg Meadows are alternatives for fields with a purely equational axiomatization. At the basis of meadows lies the decision to make the multiplicative inverse operation total by imposing that the multiplicative inverse of zero is zero. Divisive meadows are meadows with the multiplicative inverse operation replaced by a division operation. Viewing a fraction as a term over the signature of divisive meadows that is of the form p/q, we investigate which divisive meadows admit transformation of fractions into simple fractions, i.e. fractions without proper subterms that are fractions.
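The totalized inverse at the heart of meadows can be sketched in a few lines; this is an illustrative model over the rationals, not the equational axiomatization itself:

```python
from fractions import Fraction

def inv(x):
    # Total multiplicative inverse: inv(0) = 0, otherwise the usual inverse.
    return Fraction(0) if x == 0 else 1 / x

def div(p, q):
    # Divisive-meadow division, defined via the total inverse.
    return p * inv(q)

# The characteristic meadow law x * x * inv(x) = x holds even at zero:
for x in (Fraction(0), Fraction(3, 4), Fraction(-2)):
    assert x * x * inv(x) == x

print(div(Fraction(1), Fraction(0)))  # 0: division is total
```

Making inversion total is what allows meadows to be axiomatized purely equationally, with no side conditions such as "q ≠ 0" on the division laws.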

Abstract: Publication date: Available online 21 February 2016 Source:Journal of Applied Logic Author(s): Mauricio Osorio Galindo, Verónica Borja Macías, José Ramón Enrique Arrazola Ramírez In [25] Priest developed the da Costa logic (daC); this is a paraconsistent logic which is also a co-intuitionistic logic that contains the logic Cω. Due to its interesting properties it has been studied by Castiglioni, Ertola and Ferguson, and some remarkable results about it and its extensions are shown in [8,11]. In the present article we continue the study of daC; we prove that a restricted Hilbert system for daC, named DC, satisfies certain properties that help us show that this logic is not a maximal paraconsistent system. We also study an extension of daC called PH1 and give different characterizations of it. Finally we compare daC and PH1 with several paraconsistent logics.

Abstract: Publication date: Available online 4 March 2016 Source:Journal of Applied Logic Author(s): Sergey Babenyshev, Manuel A. Martins This work advances a research agenda whose main aim is the application of Abstract Algebraic Logic (AAL) methods and tools to the specification and verification of software systems. It uses a generalization of the notion of an abstract deductive system to handle multi-sorted deductive systems that differentiate visible and hidden sorts. The two main results of the paper are obtained by generalizing properties of the Leibniz congruence — the central notion in AAL. In this paper we discuss a question we posed in [1] about the relationship between the behavioral equivalences of equivalent hidden logics. We also present a necessary and sufficient intrinsic condition for two hidden logics to be equivalent.

Abstract: Publication date: July 2016 Source:Journal of Applied Logic, Volume 16 Author(s): Albert Kadji, Celestin Lele, Jean B. Nganou The goal of the present article is to extend the study of commutative rings whose ideals form an MV-algebra as carried out by Belluce and Di Nola [1] to non-commutative rings. We study and characterize all rings whose ideals form a pseudo MV-algebra, which shall be called here generalized Łukasiewicz rings. We obtain that these are (up to isomorphism) exactly the direct sums of unitary special primary rings.

Abstract: Publication date: Available online 10 March 2016 Source:Journal of Applied Logic Author(s): Rob Egrot A poset is representable if it can be embedded in a field of sets in such a way that existing finite meets and joins become intersections and unions respectively (we say finite meets and joins are preserved). More generally, for cardinals α and β, a poset is said to be (α, β)-representable if an embedding into a field of sets exists that preserves meets of sets smaller than α and joins of sets smaller than β. We show, using an ultraproduct/ultraroot argument, that when 2 ≤ α, β ≤ ω the class of (α, β)-representable posets is elementary, but does not have a finite axiomatization in the case where either α or β is ω. We also show that the classes of posets with representations preserving either countable or all meets and joins are pseudoelementary.

Abstract: Publication date: Available online 6 April 2016 Source:Journal of Applied Logic Author(s): Philippe Balbiani We consider the binary relations of negligibility, comparability and proximity in the set of all hyperreals. Associating with negligibility, comparability and proximity the binary predicates N, C and P and the connectives [N], [C] and [P], we consider a first-order theory based on these predicates and a modal logic based on these connectives. We investigate the axiomatization/completeness and the decidability/complexity of this first-order theory and this modal logic.

Abstract: Publication date: May 2016 Source:Journal of Applied Logic, Volume 15 Author(s): Norihiro Kamide It is known that the logic BI of bunched implications is a logic of resources. Many studies have reported on the applications of BI to computer science. In this paper, an extension BIS of BI by adding a sequence modal operator is introduced and studied in order to formalize more fine-grained resource-sensitive reasoning. By the sequence modal operator of BIS, we can appropriately express “sequential information” in resource-sensitive reasoning. A Gentzen-type sequent calculus SBIS for BIS is introduced, and the cut-elimination and decidability theorems for SBIS are proved. An extension of the Grothendieck topological semantics for BI is introduced for BIS, and the completeness theorem with respect to this semantics is proved. The cut-elimination, decidability and completeness theorems for SBIS and BIS are proved using some theorems for embedding BIS into BI.

Abstract: Publication date: May 2016 Source:Journal of Applied Logic, Volume 15 Author(s): Robert Demolombe, Luis Fariñas del Cerro, Naji Obeid A translation technique is presented which transforms a class of First Order Logic formulas, called Restricted formulas, into ground formulas. For the formulas in this class the range of quantified variables is restricted by Domain formulas. If we have complete knowledge of the predicates involved in the Domain formulas, their extensions can be evaluated with the Relational Algebra, and these extensions are used to transform universal (respectively existential) quantifiers into finite conjunctions (respectively disjunctions). It is assumed that the complete knowledge is represented by Completion Axioms and Unique Name Axioms à la Reiter. These axioms involve the equality predicate. However, the translation allows us to remove the equality predicate in the ground formulas, and for a large class of formulas their consequences are the same as those of the initial First Order formulas. This result opens the door to the design of efficient deduction techniques.

Abstract: Publication date: May 2016 Source:Journal of Applied Logic, Volume 15 Author(s): Lorenzo Magnani In the companion article “The eco-cognitive model of abduction” [66] I illustrated the main features of my eco-cognitive model of abduction (EC-Model). With the aim of delineating further aspects of that “naturalization of logic” recently urged by John Woods [94], I will now set out to further analyze some properties of abduction that are essential from a logical standpoint. When dealing with the so-called “inferential problem”, I will opt for the more general concepts of input and output instead of those of premisses and conclusions, and show that in this framework two consequences can be derived that help clarify basic logical aspects of abductive reasoning: 1) it is more natural to accept the “multimodal” and “context-dependent” character of the inferences involved; 2) inferences are not merely conceived of in terms of the process leading to the “generation of an output” or to the proof of it, as in the traditional and standard view of deductive proofs; rather, from this perspective abductive inferences can be seen as related to logical processes in which input and output fail to hold each other in an expected relation, with the solution involving the modification of inputs, not of outputs. The chance of finding an abductive solution still appears to depend on the Aristotelian concept of “leading away” (ἀπαγωγή) I dealt with in the companion article, that is, on initiating the application of a supplementary logic implementing an appropriate formal inference engine. An important result I will emphasize is that irrelevance and implausibility are not always offensive to reason. In addition, we cannot be sure, more broadly, that our guessed hypotheses are plausible (even if we know that looking in advance for plausibility is a good and wise human heuristic); indeed, an implausible hypothesis can later turn out to be plausible.
In the last part of the article I will argue that if we wish to naturalize the logic of abductive processes and its special consequence relation, we should refer to the following main aspects: “optimization of situatedness”, “maximization of changeability” of both input and output, and high “information-sensitiveness”. Furthermore, I will point out that a logic of abduction must acknowledge the importance of keeping a record of the “past life” of abductive inferential praxes, in contrast to traditional demonstrative ideal systems, which are prototypically characterized by what I call “maximization of memorylessness”.

Abstract: Publication date: May 2016 Source:Journal of Applied Logic, Volume 15 Author(s): Amnon Rosenmann We present a novel approach, which is based on multiple-valued logic (MVL), to the verification and analysis of digital hardware designs, which extends the common ternary or quaternary approaches for simulations. The simulations which are performed in the more informative MVL setting reveal details which are either invisible or harder to detect through binary or ternary simulations. In equivalence verification, detecting different behavior under MVL simulations may lead to the discovery of a genuine binary non-equivalence or to a qualitative gap between two designs. The value of a variable in a simulation may hold information about its degree of truth and its “place of birth” and “date of birth”. Applications include equivalence verification, initialization, assertions generation and verification, partial control on the flow of data by prioritizing and block-oriented simulations. Much of the paper is devoted to theoretical aspects behind the MVL approach, including the reason for choosing a specific algebra for computations and the introduction of the notions of De Morgan Canonical Form and of verification complexity of Boolean expressions. Two basic simulation-based algorithms are presented, one for satisfying and verifying combinational designs and the other for equivalence verification of sequential designs.
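As a minimal sketch of the starting point such an approach extends, and not the paper's richer algebra (which also tracks a value's origin and “date of birth”), here is Kleene-style ternary AND/NOT over {0, 'X', 1}, where 'X' stands for an unknown value in simulation:

```python
def t_and(a, b):
    # A controlling 0 on either input decides the output regardless of 'X'.
    if a == 0 or b == 0:
        return 0
    # Otherwise any remaining unknown keeps the output unknown.
    if a == 'X' or b == 'X':
        return 'X'
    return 1

def t_not(a):
    # Negation flips known values and preserves unknowns.
    return {0: 1, 1: 0, 'X': 'X'}[a]

print(t_and(0, 'X'), t_and(1, 'X'), t_not('X'))  # 0 X X
```

Adding further values beyond 'X' lets a simulator distinguish, for example, uninitialized signals from genuinely conflicting ones, which is the extra information the MVL setting exploits for equivalence checking.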

Abstract: Publication date: May 2016 Source:Journal of Applied Logic, Volume 15 Author(s): B.O. Akinkunmi Logical theories have been developed which have allowed temporal reasoning about eventualities (à la Galton) such as states, processes, actions, events and complex eventualities such as sequences and recurrences of other eventualities. This paper presents the problem of coincidence within the framework of a first order logical theory formalizing temporal multiple recurrence of two sequences of fixed duration eventualities and presents a solution to it. The coincidence problem is described as: if two complex eventualities (or eventuality sequences) consisting respectively of component eventualities x_0, x_1, …, x_r and y_0, y_1, …, y_s both recur over an interval k, and all eventualities are of fixed durations, is there a subinterval of k over which the incidences of x_t and y_u, for 0 ≤ t ≤ r and 0 ≤ u ≤ s, coincide? The solution presented here formalizes the intuition that a solution can be found by temporal projection over a cycle of the multiple recurrence of both sequences.
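The projection intuition, that the joint pattern of two recurring fixed-duration sequences repeats within one cycle, can be sketched with hypothetical integer durations:

```python
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

xs = [2, 3]  # hypothetical durations of x0, x1
ys = [1, 3]  # hypothetical durations of y0, y1
cycle = lcm(sum(xs), sum(ys))  # joint period of the two recurrences

# Both sequences restart together at every multiple of `cycle`, so one
# cycle suffices to search for instants where the two recurrences align:
hits = [t for t in range(cycle) if t % sum(xs) == 0 and t % sum(ys) == 0]
print(cycle, hits)  # 20 [0]
```

Within one such cycle every relative alignment of the two sequences occurs exactly once, so checking the cycle decides the coincidence question for the whole interval k.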

Abstract: Publication date: May 2016 Source:Journal of Applied Logic, Volume 15 Author(s): Tahel Ronel, Alena Vencovská We investigate the notion of a signature in Polyadic Inductive Logic and study the probability functions satisfying the Principle of Signature Exchangeability. We prove a representation theorem for such functions on binary languages and show that they satisfy a binary version of the Principle of Instantial Relevance. We discuss polyadic versions of the Principle of Instantial Relevance and Johnson's Sufficientness Postulate.

Abstract: Publication date: May 2016 Source:Journal of Applied Logic, Volume 15 Author(s): José Luis Castiglioni, Hernán Javier San Martín Let us write ℓGuf for the category whose objects are lattice-ordered abelian groups (l-groups for short) with a strong unit and finite prime spectrum, endowed with a collection of Archimedean elements, one for each prime l-ideal, which satisfy certain properties, and whose arrows are l-homomorphisms with additional structure. In this paper we show that a functor which assigns to each object (A, û) ∈ ℓGuf the prime spectrum of A, and to each arrow f : (A, û) → (B, v̂) ∈ ℓGuf the naturally induced p-morphism, has a left adjoint.