We investigate second-order quantifier elimination for a class of formulas characterized by a restriction on the quantifier prefix: existential predicate quantifiers followed by universal individual quantifiers and a relational matrix. For a given second-order formula of this class, a possibly infinite sequence of universal first-order formulas can be constructed that increase in strength and are all entailed by the second-order formula. Any first-order consequence of the second-order formula is a consequence of some member of the sequence. The sequence thus provides a recursive base for the first-order theory of the second-order formula, in the sense investigated by Craig. The restricted formula class makes it possible to derive further properties, for example that the set of those members of the sequence that are equivalent to the second-order formula, or, more generally, have the same first-order consequences, is co-recursively enumerable. Also the set of first-order formulas that entail the second-order formula is co-recursively enumerable. These properties are proven with formula-based tools used in automated deduction, such as domain closure axioms, elimination of individual quantifiers by ground expansion, predicate quantifier elimination with Ackermann's Lemma, Craig interpolation, and the decidability of the Bernays-Schönfinkel-Ramsey class.
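As a reminder of the elimination tool named above - a standard statement, not quoted from the paper: Ackermann's Lemma eliminates a single predicate quantifier when the matrix has suitable polarity. If the predicate variable p does not occur in A and occurs only negatively in B, then

    \exists p\, \big( \forall \bar{x}\, (A(\bar{x}) \rightarrow p(\bar{x})) \land B \big) \;\equiv\; B[p := A],

and dually, with the implication reversed, when p occurs only positively in B.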
Finding solution values for unknowns in Boolean equations was a principal reasoning mode in the Algebra of Logic of the 19th century. Schröder investigated it as Auflösungsproblem (solution problem). It is closely related to the modern notion of Boolean unification. Today it is commonly presented in an algebraic setting, but it seems potentially useful also in knowledge representation based on predicate logic. We show that it can be modeled on the basis of first-order logic extended by second-order quantification. A wealth of classical results transfers, foundations for algorithms unfold, and connections with second-order quantifier elimination and Craig interpolation show up. Although for first-order inputs the set of solutions is recursively enumerable, the development of constructive methods remains a challenge. We identify some cases that allow constructions, most of them based on Craig interpolation, and show a method to take vocabulary restrictions on solution components into account.
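For orientation, the classical propositional special case - textbook material, not taken from the abstract above: with F(x) written in Shannon-expanded form as (x and F(true)) or (not x and F(false)), the equation F(x) = false has a solution for the unknown x iff the condition

    F(\top) \land F(\bot) \;\equiv\; \bot

holds, and then Boole's reproductive solution, with a fresh parameter t, is

    x \;\equiv\; F(\bot) \lor (t \land \neg F(\top)),

which yields every solution as t ranges over all formulas.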
The PIE system aims at providing an environment for creating complex applications of automated first-order theorem proving techniques. It is embedded in Prolog. Beyond actual proving tasks, it also supports interpolation and second-order quantifier elimination. A macro feature and a LaTeX formula pretty-printer facilitate the construction of elaborate formalizations from small, understandable and documented units. For use with interpolation and elimination, preprocessing operations make it possible to preserve the semantics of chosen predicates. The system comes with a built-in default prover that can compute interpolants.
We investigate possibilities to utilize techniques of computational logic for scholarly editing. Semantic Web technology already makes relevant large knowledge bases available in the form of logic formulas. There are several further roles for logic-based reasoning in machine-supported scholarly editing. KBSET, a free prototype system, provides a platform for experiments.
For relational monadic formulas (the Löwenheim class) second-order quantifier elimination, which is closely related to computation of uniform interpolants, projection and forgetting - operations that currently receive much attention in knowledge processing - always succeeds. The decidability proof for this class by Heinrich Behmann from 1922 explicitly proceeds by elimination with equivalence preserving formula rewriting. Here we reconstruct the results from Behmann's publication in detail and discuss related issues that are relevant in the context of modern approaches to second-order quantifier elimination in computational logic. In addition, an extensive documentation of the letters and manuscripts in Behmann's bequest that concern second-order quantifier elimination is given, including a commented register and English abstracts of the German sources with focus on technical material. In the late 1920s Behmann attempted to develop an elimination-based decision method for formulas with predicates whose arity is larger than one. His manuscripts and the correspondence with Wilhelm Ackermann show technical aspects that are still of interest today and give insight into the genesis of Ackermann's landmark paper “Untersuchungen über das Eliminationsproblem der mathematischen Logik” from 1935, which laid the foundation of the two prevailing modern approaches to second-order quantifier elimination.
For relational monadic formulas (the Löwenheim class) second-order quantifier elimination, which is closely related to computation of uniform interpolants, forgetting and projection, always succeeds. The decidability proof for this class by Behmann from 1922 explicitly proceeds by elimination with equivalence preserving formula rewriting. We reconstruct Behmann's method, relate it to the modern DLS elimination algorithm, and show some applications where the essential monadicity becomes apparent only at second glance: deciding ALCOQH knowledge bases, elimination in DL-Lite knowledge bases, and the justification of the success of elimination methods for Sahlqvist formulas.
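A small worked example of such an elimination - ours, for illustration; it follows the Ackermann's Lemma pattern: in the monadic formula below, the quantified predicate P is bounded above by Q and the remainder is positive in P, so substituting the bound eliminates the quantifier:

    \exists P\, \big( \forall x\, (P(x) \rightarrow Q(x)) \land \exists x\, P(x) \big) \;\equiv\; \exists x\, Q(x).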
Predicate quantification can be applied to characterize those definientia of a given formula that are expressed in terms of a given set of predicates. Methods for second-order quantifier elimination and the closely related computation of forgetting, projection and uniform interpolants can then be applied to compute such definientia. Here we address the question of whether this principle can be transferred to definientia in given classes that allow efficient processing, such as Horn or Krom formulas. Indeed, if propositional logic is taken as the basis, for the class of all formulas that are equivalent to a conjunction of atoms and the class of all formulas that are equivalent to a Krom formula, the existence of definientia as well as representative definientia themselves can be characterized in terms of predicate quantification. For the class of formulas that are equivalent to a Horn formula, this is possible with a special further operator. For first-order logic as the basis, we indicate guidelines and open issues.
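The underlying characterization can be sketched as follows - our rendering of a known pattern in this line of work. A formula G in the admitted vocabulary is a definiens of p(x) with respect to F iff F entails that p(x) and G are equivalent, which holds exactly if G lies between two second-order bounds:

    \exists p\, (F \land p(\bar{x})) \;\models\; G \;\models\; \forall p\, (F \rightarrow p(\bar{x})).

A definiens in the vocabulary exists iff the left bound entails the right one, and then any Craig interpolant of the two in that vocabulary is a definiens.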
A prototype system is described whose core functionality is, based on propositional logic, the elimination of second-order operators, such as Boolean quantifiers and operators for projection, forgetting and circumscription. This approach makes it possible to express many representational and computational tasks in knowledge representation - for example, the computation of abductive explanations and of models with respect to logic programming semantics - in a uniform operational system, backed by a uniform classical semantic framework.
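The elimination of a Boolean quantifier, the most basic of these operators, can be sketched in a few lines - a minimal illustration using sympy, not code from the described system: existentially quantifying p in F amounts to F[p := true] or F[p := false], which also computes forgetting of p.

    from sympy import symbols
    from sympy.logic.boolalg import And, Or, Not, simplify_logic

    p, q, r = symbols('p q r')

    def forget(formula, var):
        # exists var. formula  ==  formula[var := True] | formula[var := False]
        return simplify_logic(Or(formula.subs(var, True),
                                 formula.subs(var, False)))

    F = And(Or(Not(p), q), Or(p, r))   # (p -> q) & (p | r)
    print(forget(F, p))                # prints: q | r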
It is known that skeptical abductive explanations with respect to classical logic can be characterized semantically in a natural way as formulas with second-order quantifiers. Computing explanations is then just elimination of the second-order quantifiers. By using application patterns and generalizations of second-order quantification, like literal projection, the globally weakest sufficient condition and circumscription, we transfer these principles in a unifying framework to abduction with three non-classical semantics of logic programming: stable model, partial stable model and well-founded semantics. New insights are revealed about abduction with the partial stable model semantics.
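The second-order notions involved can be summarized as follows - standard definitions in the style of Lin's work on these conditions, in our rendering. With q denoting the tuple of predicates that may not occur in the condition, the strongest necessary and the weakest sufficient condition of G within background F are

    \mathrm{SNC}(G) \;\equiv\; \exists \bar{q}\, (F \land G), \qquad \mathrm{WSC}(G) \;\equiv\; \forall \bar{q}\, (F \rightarrow G) \;\equiv\; \neg\, \mathrm{SNC}(\neg G),

and a weakest skeptical abductive explanation of an observation is its weakest sufficient condition over the assumable vocabulary.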
We present a formalism that models the computation of clause sharing portfolio solvers with inprocessing. The soundness of these solvers is not a straightforward property, since shared clauses can make a formula unsatisfiable. Therefore, we develop characterizations of simplification techniques and suggest various settings in which clause sharing and inprocessing can be combined. Our formalization models most of the recently implemented portfolio systems, and we indicate possibilities to improve these. A particular improvement is a novel way to combine clause addition techniques - like blocked clause addition - with clause deletion techniques - like blocked clause elimination or variable elimination.
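As an illustration of one of the techniques named above - a minimal sketch of the standard blocked clause test, not code from the paper: a clause C is blocked in a CNF if it contains a literal l such that every resolvent of C with a clause containing the complement of l is a tautology.

    # Minimal sketch: blocked clause test for propositional CNF.
    # Clauses are frozensets of DIMACS-style integer literals (-v negates v).

    def tautological_resolvent(c, d, lit):
        # The resolvent of c and d upon lit is a tautology iff it
        # contains some literal together with its complement.
        resolvent = (c - {lit}) | (d - {-lit})
        return any(-x in resolvent for x in resolvent)

    def is_blocked(clause, cnf):
        # clause is blocked iff some literal l in it resolves
        # tautologically with every clause of cnf that contains -l.
        return any(all(tautological_resolvent(clause, d, l)
                       for d in cnf if -l in d)
                   for l in clause)

    cnf = [frozenset(c) for c in ({1, 2}, {-1, -2}, {-1, 2})]
    print(is_blocked(frozenset({1, 2}), cnf))   # True (blocked upon literal 2)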
Stenning and van Lambalgen introduced an approach to model empirically studied human reasoning with nonmonotonic logics. Some of the research questions that have been brought up in this context concern the interplay of the open- and closed-world assumption, the suitability of particular logic programming semantics for the modeling of human reasoning, and the role of three-valued logic programming semantics and three-valued logics. We look into these questions from the perspective of a framework where logic programs that model human reasoning are represented declaratively and are mechanizable by classical formulas extended with certain second-order operators.
We develop a semantic framework that extends first-order logic by literal projection and a novel second, semantically defined operator, “raising”, which is only slightly different from literal projection and can be used to define a generalization of parallel circumscription with varied predicates in a straightforward and compact way. We call this variant of circumscription “scope-determined”, since, like literal projection and raising, its effects are controlled by a so-called “scope”, that is, a set of literals, as a parameter. We formally work out a toolkit of propositions about projection, raising and circumscription and their interaction. It reveals refinements of and new views on previously known properties. In particular, we apply it to show that well-foundedness with respect to circumscription can be expressed in terms of projection, and that a characterization of the consequences of circumscribed propositional formulas in terms of literal projection can be generalized to first-order logic and expressed compactly in terms of new variants of the strongest necessary and weakest sufficient condition.
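For reference, the classical definition that is generalized here - standard parallel circumscription with varied predicates; the paper's scope-determined variant is controlled by a literal scope instead: minimizing the predicates P while varying Z in F is the second-order formula

    \mathrm{CIRC}[F; \bar{P}; \bar{Z}] \;\equiv\; F(\bar{P}, \bar{Z}) \land \neg \exists \bar{p}\, \bar{z}\, \big( F(\bar{p}, \bar{z}) \land \bar{p} < \bar{P} \big),

where p < P abbreviates the conjunction of the pointwise inclusions together with the requirement that at least one of them is strict.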
Recently, an approach to model human reasoning as studied in cognitive science by logic programming has been introduced by Stenning and van Lambalgen and exemplified with the suppression task. We investigate this approach from the view of a framework where different logic programming semantics correspond to different translations of logic programs into formulas of classical two-valued logic extended by two second-order operators, circumscription and literal projection. Based on combining and extending previously known such renderings of logic programming semantics, we take semantics into account that have not yet been considered in the context of human reasoning, such as stable models and partial stable models. To model human reasoning, it is essential that only some predicates can be subjected to closed world reasoning, while others are handled by open world reasoning. In our framework, variants of familiar logic programming semantics that are extended with this feature are derived from a generic circumscription-based representation. Further, we develop a two-valued representation of a three-valued logic that renders semantics considered for human reasoning based on the Fitting operator.
In this paper we contribute to bridging the gap between human reasoning as studied in Cognitive Science and commonsense reasoning based on formal logics and formal theories. In particular, the suppression task studied in Cognitive Science provides an interesting challenge problem for human reasoning based on logic. The work presented in this paper is founded on the recent approach by Stenning and van Lambalgen to model human reasoning by means of logic programs with a specific three-valued completion semantics and a semantic fixpoint operator that yields a least model, as well as abduction. Their approach has subsequently been made more precise and technically accurate by switching to three-valued Łukasiewicz logic. In this paper, we extend this refined approach by abduction. We show that the inclusion of abduction makes it possible to adequately model additional empirical results reported from Cognitive Science. For the arising abductive reasoning tasks we give complexity results. Finally, we outline several open research issues that emerge from the application of logic to model human reasoning.
We pursue a representation of logic programs as classical first-order sentences. Different semantics for logic programs can then be expressed by the way in which they are wrapped into - semantically defined - operators for circumscription and projection. (Projection is a generalization of second-order quantification.) We demonstrate this for the stable model semantics, Clark's completion and a three-valued semantics based on the Fitting operator. To represent the latter, we utilize the polarity sensitivity of projection, in contrast to second-order quantification, and a variant of circumscription that makes it possible to express predicate minimization in parallel with maximization. In accord with the aim of an integrated view on different logic-based representation techniques, the material is worked out on the basis of first-order logic with a Herbrand semantics.
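A small example of one of these renderings - ours; Clark's completion in its textbook propositional form: the program consisting of the rules p <- q, not r and p <- s is represented by completing the definition of p into a biconditional,

    p \leftrightarrow \big( (q \land \neg r) \lor s \big),

together with the completions q <-> false, r <-> false and s <-> false for the predicates that head no rule.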
We develop a formal framework intended as a preliminary step towards a single knowledge representation system that provides different representation techniques in a unified way. In particular, we consider first-order logic extended by techniques for second-order quantifier elimination and non-monotonic reasoning. In this paper two independent results are developed. The background for the first result is literal projection, a generalization of second-order quantification which permits, so to speak, quantification upon an arbitrary set of ground literals, instead of just (all ground literals with) a given predicate symbol. We introduce an operator raise that is only slightly different from literal projection and can be used to define a generalization of predicate circumscription in a straightforward and compact way. We call this variant of circumscription scope-determined. Some properties of raise and scope-determined circumscription, also in combination with literal projection, are then shown. A previously known characterization of consequences of circumscribed formulas in terms of literal projection is generalized from propositional to first-order logic and proven on the basis of the introduced concepts. The second result developed in this paper is a characterization of stable models in terms of circumscription. Unlike traditional characterizations, it does not rely on syntactic notions like reduct and fixed-point construction. It essentially renders a recently proposed “circumscription-like” characterization in a compact way, without involvement of a non-classically interpreted connective.
Projection computation is a generalization of second-order quantifier elimination, which in turn is closely related to the computation of forgetting and of uniform interpolants. On the basis of a unified view on projection computation and knowledge compilation, we develop a framework for applying tableau methods to these tasks. It takes refinements from performance-oriented systems into account. Formula simplifications are incorporated at the level of tableau structure modification, and at the level of simplifying encountered subformulas that are not yet fully compiled. In particular, such simplifications can involve projection computation, where this is possible with low cost. We represent tableau construction by means of rewrite rules on formulas, extended with some auxiliary functors, which is particularly convenient for formula transformation tasks. As instantiations of the framework, we discuss approaches to propositional knowledge compilation from the literature, including adaptations of DPLL, and the hyper tableau calculus for first-order clauses.
Projection is a logic operation that makes it possible to express tasks in knowledge representation. These tasks involve the extraction or removal of knowledge concerning a given sub-vocabulary. It is a generalization of second-order quantification, permitting, so to speak, to `quantify' upon an arbitrary set of ground literals instead of just (all ground literals with) a given predicate symbol. In Automated Deduction for Projection Elimination, a semantic characterization of projection for first-order logic is presented. On this basis, properties underlying applications and processing methods are derived. The computational processing of projection, called projection elimination in analogy to quantifier elimination, can be performed by adapted theorem proving methods. This is shown for resolvent generation and, in more depth, tableau construction. An abstract framework relates projection elimination with knowledge compilation and shows the adaptation of key features of high-performance tableau systems. As a prototypical instance, an adaptation of a modern DPLL method, such as those underlying state-of-the-art SAT solvers, is worked out. It generalizes various recent knowledge compilation methods and utilizes the interplay with projection elimination for efficiency improvements.
The computation of literal projection generalizes predicate quantifier elimination by permitting, so to speak, quantification upon an arbitrary set of ground literals, instead of just (all ground literals with) a given predicate symbol. Literal projection makes it possible, for example, to express predicate quantification upon a predicate in just positive or negative polarity. Occurrences of the predicate in literals with the complementary polarity are then considered as unquantified predicate symbols. We present a formalization of literal projection and related concepts, such as literal forgetting, for first-order logic with a Herbrand semantics, which makes these notions easy to access, since they are expressed there by means of straightforward relationships between sets of literals. With this formalization, we show properties of literal projection that hold for formulas free of certain links - pairs of literals with complementary instances, each in a different conjunct of a conjunction, both in the scope of a universal first-order quantifier, or one in a subformula and the other in its context formula. These properties can justify the application of methods that construct formulas without such links to the computation of literal projection. Some tableau methods and direct methods for second-order quantifier elimination can be understood in this way.
The E-KRHyper system is a model generator and theorem prover for first-order logic with equality. It implements the new E-hyper tableau calculus, which integrates a superposition-based handling of equality into the hyper tableau calculus. E-KRHyper extends our previous KRHyper system, which has been used in a number of applications in the field of knowledge representation. In contrast to most first-order theorem provers, it supports features important for such applications, for example queries with predicate extensions as answers, handling of large sets of uniformly structured input facts, arithmetic evaluation and stratified negation as failure. It is our goal to extend the range of application possibilities of KRHyper by adding equality reasoning.
Generalized methods for automated theorem proving can be used to compute formula transformations such as projection elimination and knowledge compilation. We present a framework based on clausal tableaux suited for such tasks. These tableaux are characterized independently of particular construction methods, but important features of empirically successful methods are taken into account, especially dependency-directed backjumping and branch-local operation. As an instance of that framework an adaptation of DPLL is described. We show that knowledge compilation methods can be substantially improved by weaving projection elimination partially into the compilation phase.
Some operations to decompose a knowledge base (considered as a first-order logic formula) in ways such that only its semantics determines the results are investigated. Intended uses include the extraction of “parts” relevant to an application, the exploration and utilization of implicit possibilities of structuring a knowledge base, and the formulation of query answers in terms of a signature demanded by an application. A semantic framework based on Herbrand interpretations is outlined. The notion of “model relative to a scope” is introduced. It underlies the partitioning operations “projection” and “forgetting” and also provides a semantic account for certain formula simplification operations. An algorithmic approach which is based on resolution and may be regarded as a variation of the SCAN algorithm is discussed.
Three real-world applications are described, each of which has a full first-order theorem prover based on the hyper tableau calculus as its core component. These applications concern information retrieval in electronic publishing, the integration of description logics with other knowledge representation techniques, and XML query processing.
KRHyper is a first-order logic theorem proving and model generation system based on the hyper tableau calculus. It is targeted for use as an embedded system within knowledge-based applications. In contrast to most first-order theorem provers, it supports features important for those applications, for example queries with predicate extensions as answers, handling of large sets of uniformly structured input facts, arithmetic evaluation and stratified negation as failure.
We describe a concept for the cooperation of a computer algebra system, several automated theorem provers and a mathematical library. The purpose of this cooperation is to enable intelligent retrieval of theorems and definitions from the remote mathematical library. Automated theorem provers compete on remote machines to verify conjectures provided by the user in a local copy of Mathematica. They make use of a remote knowledge base which contains parts of the Mizar Mathematical Library.
Over the years, interactive theorem provers have built a large body of verified computer mathematics. The ILF Mathematical Library aims to make this knowledge available to other systems. One of the reasons for such a project is economy. Verification of software and hardware frequently requires the proof of purely mathematical theorems. It is obviously inefficient to use the time of experts in the design of software or hardware systems to prove such theorems. Another reason for presenting a collection of mathematical theorems in a unified framework is safety. It should facilitate the verification of theorems in the library of one system by other systems. A third reason is the dynamics of research. New interactive theorem provers should obtain the possibility to show their usability for real-world problems without having to reprove elementary mathematical facts. Last but not least, it is hoped that reproving theorems in a uniform mathematical library will be considered a challenge for the development of automated theorem provers.
The idea of InfraEngine is to help establish a Semantic Web infrastructure through a specially adapted AI planning engine. Inputs are distributed Web documents in Semantic Web formats proposed by the World Wide Web Consortium; outputs are delivered in such formats as well. The user interface makes it possible to browse Semantic Web data and to control the planning services from any Web browser. Other programs can also make use of these services. The aim is to provide a small and understandable, yet general, mechanism that can be used for different kinds of applications.
We outline two application scenarios which might be in the realm of short-term applications of the Semantic Web: a software packaging system and the organization of a business trip. Both of them can be solved with today's technology to some degree, so they do not show the novel potential of the Semantic Web in full. However, considering Semantic Web solutions for them is useful to get a picture of the characteristics of the Semantic Web and to become aware of some concrete technical issues.
We propose resource-oriented inference as one way of bringing the Semantic Web into action. It provides a framework for expressing and processing a variety of tasks from areas such as planning, scheduling, manufacturing resource planning, product data management, configuration management, workflow management and simulation. Resource-oriented inference as a part of the Semantic Web should allow such tasks to be performed within the scope of the World Wide Web. A prototypical application is the purchase of a complex product with the help of the Web. The product consists of multiple parts, some of them complex products by themselves. Several services are required to compose the product. Subparts and services can be provided by different companies all over the world. Delivery time and cost of the product should be optimized.
We outline a small Web-embedded language for specifying types and reasoning about them. Its main features are: (1.) From primitive types, which require from their instances just that they implement a single method, complex types can be constructed by a few operators, such as type intersection. This composition of type definitions maps to the composition of fragments of type definitions in different Web documents. (2.) Types are compared by structural equivalence and not based on their names. This facilitates the combination of independently developed models and the implicit association of types with given instances. (3.) The hyperlinking of expressions of the modeling language can be understood in terms of the defining equations of a rewriting system: the left-hand side of such a rewrite rule is a URI, the right-hand side an expression. Hyperlink dereferencing corresponds to a rewrite step with such a rule.
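The rewriting view of feature (3.) can be made concrete with a small sketch - all URIs and definitions here are invented for illustration: each URI acts as the left-hand side of a rewrite rule, and dereferencing normalizes an expression by exhaustively replacing URIs with their defining expressions.

    # Illustrative sketch: hyperlink dereferencing as term rewriting.
    # The rule set maps URIs (left-hand sides) to expressions (right-hand
    # sides); expressions are strings (URIs or names) or nested tuples.
    rules = {
        "http://example.org/Readable": ("method", "read"),
        "http://example.org/Writable": ("method", "write"),
        "http://example.org/File": ("and", "http://example.org/Readable",
                                           "http://example.org/Writable"),
    }

    def dereference(expr):
        # Normalize by rewriting until no rule applies (assumes acyclic rules).
        if isinstance(expr, str) and expr in rules:
            return dereference(rules[expr])
        if isinstance(expr, tuple):
            return tuple(dereference(e) for e in expr)
        return expr

    print(dereference("http://example.org/File"))
    # ('and', ('method', 'read'), ('method', 'write'))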