Publications

Synthesizing Strongly Equivalent Logic Programs: Beth Definability for Answer Set Programs via Craig Interpolation in First-Order Logic
Jan Heuer and Christoph Wernhard
In Christoph Benzmüller, Marijn Heule, and Renate A. Schmidt, editors, International Joint Conference on Automated Reasoning, IJCAR 2024, LNCS (LNAI). Springer, 2024.
To appear, preprint: https://arxiv.org/abs/2402.07696.
bib ]

We show a projective Beth definability theorem for logic programs under the stable model semantics: for given programs P and Q and a vocabulary V (a set of predicates), the existence of a program R in V such that P ∪ R and P ∪ Q are strongly equivalent can be expressed as a first-order entailment. Moreover, our result is effective: a program R can be constructed from a Craig interpolant for this entailment, using a known first-order encoding for testing strong equivalence, which we apply in reverse to extract programs from formulas. As a further perspective, this allows transforming logic programs via transforming their first-order encodings. In a prototypical implementation, the Craig interpolation is performed by first-order provers based on clausal tableaux or resolution calculi. Our work shows how definability and interpolation, which underlie modern logic-based approaches to advanced tasks in knowledge representation, transfer to answer set programming.
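
For orientation, the classical property underlying this construction is Craig interpolation in first-order logic, stated here for illustration (the paper's contribution is transferring it to programs under strong equivalence):

    If F ⊨ G, then there is a formula I with F ⊨ I and I ⊨ G
    such that voc(I) ⊆ voc(F) ∩ voc(G).

In the paper's setting, the entailment encodes the strong equivalence condition, and a program R over V is read off from such an interpolant I.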

Synthesizing Nested Relational Queries from Implicit Specifications: Via Model Theory and via Proof Theory
Michael Benedikt, Cécilia Pradic, and Christoph Wernhard
Logical Methods in Computer Science, 2024.
To appear, preprint: http://arxiv.org/abs/2212.03085.
bib ]

Derived datasets can be defined implicitly or explicitly. An implicit definition (of dataset O in terms of datasets I) is a logical specification involving two distinguished sets of relational symbols. One set of relations is for the “source data” I, and the other is for the “interface data” O. Such a specification is a valid definition of O in terms of I if any two models of the specification agreeing on I agree on O. In contrast, an explicit definition is a transformation (or “query” below) that produces O from I. Variants of Beth's theorem state that one can convert implicit definitions to explicit ones. Further, this conversion can be done effectively given a proof witnessing implicit definability in a suitable proof system. We prove the analogous implicit-to-explicit result for nested relations: implicit definitions, given in the natural logic for nested relations, can be converted to explicit definitions in the nested relational calculus (NRC). We first provide a model-theoretic argument for this result, which makes some additional connections that may be of independent interest: between NRC queries and interpretations, a standard mechanism for defining structure-to-structure translations in logic, and between interpretations and implicit definability “up to unique isomorphism”. The latter connection uses a variation of a result of Gaifman concerning “relatively categorical” theories. We also provide a proof-theoretic result that provides an effective argument: from a proof witnessing implicit definability, we can efficiently produce an NRC definition. This will involve introducing the appropriate proof system for reasoning with nested sets, along with some auxiliary Beth-type results for this system. As a consequence, we can effectively extract rewritings of NRC queries in terms of NRC views, given a proof witnessing that the query is determined by the views.
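
A minimal illustration of the implicit/explicit distinction (a toy flat-relational example, not from the paper):

    Specification:        ∀x (O(x) ↔ (I1(x) ∧ ¬I2(x)))
    Implicit definition:  any two models agreeing on I1, I2 agree on O
    Explicit definition:  the query returning the tuples in I1 but not in I2

The paper establishes that in the nested relational setting such conversions are always possible, and effectively so.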

Structure-Generating First-Order Theorem Proving
Christoph Wernhard
In Jens Otten and Wolfgang Bibel, editors, AReCCa 2023 - Automated Reasoning with Connection Calculi, International Workshop, volume 3613 of CEUR Workshop Proceedings, pages 64-83. CEUR-WS.org, 2024.
bib | .pdf ]

Provers based on the connection method can become much more powerful than currently believed. We substantiate this thesis with certain generalizations of known techniques. In particular, we generalize proof structure enumeration interwoven with unification - the way goal-driven connection and clausal tableaux provers proceed - to an interplay of goal- and axiom-driven processing. It permits lemma re-use and heuristic restrictions known from saturating provers. Proof structure terms, proof objects that allow various ways of building proofs to be specified and implemented, are central to our approach. Meredith's condensed detachment represents the prototypical base case of such proof structure terms. Hence, we focus on condensed detachment problems, first-order Horn problems of a specific form. Experiments show that for this problem class the approach keeps up with state-of-the-art first-order provers, leads to remarkably short proofs, solves an ATP challenge problem, and is useful in machine learning for ATP. A general aim is to make ATP more accessible to systematic investigations in the space between calculi and implementation aspects.
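
For illustration, the prototypical base case can be captured in a few lines of SWI-Prolog (a minimal sketch, assuming a term i(A,B) encodes the implication “A implies B”; this is not the paper's implementation):

    % cd(+Major, +Minor, -Conclusion): condensed detachment derives the
    % consequent B of a major premise i(A,B) under the most general
    % unifier of its antecedent A with the minor premise.
    cd(i(A, B), C, B) :-
        unify_with_occurs_check(A, C).

For example, cd(i(P, i(Q, P)), a, R) instantiates P to a and yields R = i(Q, a), with Q still a variable.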

Range-Restricted and Horn Interpolation through Clausal Tableaux
Christoph Wernhard
In Revantha Ramanayake and Josef Urban, editors, Automated Reasoning with Analytic Tableaux and Related Methods: 32nd International Conference, TABLEAUX 2023, volume 14278 of LNCS (LNAI), pages 3-23. Springer, 2023.
[ Preprint (extended version): https://arxiv.org/abs/2306.03572 | Presentation slides ].
bib | DOI ]

We show how variations of range-restriction and also the Horn property can be passed from inputs to outputs of Craig interpolation in first-order logic. The proof system is clausal tableaux, which stems from first-order ATP. Our results are induced by a restriction of the clausal tableau structure, which can be achieved in general by a proof transformation, even if the source proof is by resolution/paramodulation. Primarily addressed applications are query synthesis and reformulation with interpolation. Our methodical approach combines operations on proof structures with the immediate perspective of feasible implementation through incorporating highly optimized first-order provers.

Lemmas: Generation, Selection, Application
Michael Rawson, Christoph Wernhard, Zsolt Zombori, and Wolfgang Bibel
In Revantha Ramanayake and Josef Urban, editors, Automated Reasoning with Analytic Tableaux and Related Methods: 32nd International Conference, TABLEAUX 2023, volume 14278 of LNCS (LNAI), pages 153-174. Springer, 2023.
[ Preprint (extended version): https://arxiv.org/abs/2303.05854 | Presentation slides ].
bib | DOI ]

Noting that lemmas are a key feature of mathematics, we engage in an investigation of the role of lemmas in automated theorem proving. The paper describes experiments with a combined system involving learning technology that generates useful lemmas for automated theorem provers, demonstrating improvement for several representative systems and solving a hard problem not solved by any system for twenty years. By focusing on condensed detachment problems we simplify the setting considerably, allowing us to get at the essence of lemmas and their role in proof search.

Learning to Identify Useful Lemmas from Failure
Michael Rawson, Christoph Wernhard, and Zsolt Zombori
In Michael R. Douglas, Thomas C. Hales, Cezary Kaliszyk, Stephan Schulz, and Josef Urban, editors, 8th Conference on Artificial Intelligence and Theorem Proving, AITP 2023 (Informal Book of Abstracts), 2023.
bib ]

Investigations into Proof Structures
Christoph Wernhard and Wolfgang Bibel
2023.
[ Preprint (submitted): https://arxiv.org/abs/2304.12827 ].
bib ]

We introduce and elaborate a novel formalism for the manipulation and analysis of proofs as objects in a global manner. In this first approach the formalism is restricted to first-order problems characterized by condensed detachment. It is applied in an exemplary manner to a coherent and comprehensive formal reconstruction and analysis of historical proofs of a widely-studied problem due to Łukasiewicz. The underlying approach opens the door towards new systematic ways of generating lemmas in the course of proof search, with the effect of reducing the search effort and finding shorter proofs. Among the numerous reported experiments along this line, a proof of Łukasiewicz's problem was automatically discovered that is much shorter than any proof found before by man or machine.

Synthesizing Nested Relational Queries from Implicit Specifications
Michael Benedikt, Cécilia Pradic, and Christoph Wernhard
In Principles of Database Systems, PODS 23, pages 33-45, 2023.
bib | DOI ]

Derived datasets can be defined implicitly or explicitly. An implicit definition (of dataset O in terms of datasets I) is a logical specification involving the source data I and the interface data O. It is a valid definition of O in terms of I if any two models of the specification agreeing on I agree on O. In contrast, an explicit definition is a query that produces O from I. Variants of Beth's theorem state that one can convert implicit definitions to explicit ones. Further, this conversion can be done effectively given a proof witnessing implicit definability in a suitable proof system. We prove the analogous effective implicit-to-explicit result for nested relations: implicit definitions, given in the natural logic for nested relations, can be effectively converted to explicit definitions in the nested relational calculus (NRC). As a consequence, we can effectively extract rewritings of NRC queries in terms of NRC views, given a proof witnessing that the query is determined by the views.

Compressed Combinatory Proof Structures and Blending Goal- with Axiom-Driven Reasoning: Perspectives for First-Order ATP with Condensed Detachment and Clausal Tableaux
Christoph Wernhard
In Michael R. Douglas, Thomas C. Hales, Cezary Kaliszyk, Stephan Schulz, and Josef Urban, editors, 7th Conference on Artificial Intelligence and Theorem Proving, AITP 2022 (Informal Book of Abstracts), 2022.
bib ]

Generating Compressed Combinatory Proof Structures - An Approach to Automated First-Order Theorem Proving
Christoph Wernhard
In Boris Konev, Claudia Schon, and Alexander Steen, editors, 8th Workshop on Practical Aspects of Automated Reasoning, PAAR 2022, volume 3201 of CEUR Workshop Proceedings. CEUR-WS.org, 2022.
bib ]

Representing a proof tree by a combinator term that reduces to the tree lets subtle forms of duplication within the tree materialize as duplicated subterms of the combinator term. In a DAG representation of the combinator term these straightforwardly factor into shared subgraphs. To search for proofs, combinator terms can be enumerated, like clausal tableaux, interwoven with unification of formulas that are associated with nodes of the enumerated structures. To restrict the search space, the enumeration can be based on proof schemas defined as parameterized combinator terms. We introduce here this “combinator term as proof structure” approach to automated first-order proving, present an implementation and first experimental results. The approach builds on a term view of proof structures rooted in condensed detachment and the connection method. It realizes features known from the connection structure calculus, which has not been implemented so far.
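
A minimal sketch of the compression effect in SWI-Prolog (illustrative only; the predicate names are made up): in d(d(1,1), d(1,1)) the duplicated subterm d(1,1) counts only once in the DAG representation, which the following fragment measures by counting distinct subterms:

    % dag_size(+Term, -N): N is the number of distinct subterms of Term,
    % i.e., the node count of its maximally shared DAG representation.
    dag_size(T, N) :-
        distinct_subterms(T, S),
        length(S, N).

    distinct_subterms(T, [T]) :-
        atomic(T), !.
    distinct_subterms(T, S) :-
        T =.. [_|Args],
        maplist(distinct_subterms, Args, Ss),
        append(Ss, S0),
        sort([T|S0], S).

Here dag_size(d(d(1,1), d(1,1)), N) gives N = 3, whereas the corresponding proof tree has 7 nodes.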

CD Tools - Condensed Detachment and Structure Generating Theorem Proving (System Description)
Christoph Wernhard
Technical report, 2022.
bib | DOI ]

CD Tools is a Prolog library for experimenting with condensed detachment in first-order ATP, which puts a recent formal view centered around proof structures into practice. From the viewpoint of first-order ATP, condensed detachment offers a setting that is relatively simple but with essential features and serious applications, making it attractive as a basis for developing and evaluating novel techniques. CD Tools includes specialized provers based on the enumeration of proof structures. We focus here on one of these, SGCD, which permits blending goal- and axiom-driven proof search in particularly flexible ways. In purely goal-driven configurations it acts similarly to a prover of the clausal tableaux or connection method family. In blended configurations its performance is much stronger, close to state-of-the-art provers, while emitting relatively short proofs. Experiments show characteristics and application possibilities of the structure-generating approach realized by that prover. For a historic problem often studied in ATP it produced a new proof that is much shorter than any known one.

Proceedings of the Second Workshop on Second-Order Quantifier Elimination and Related Topics (SOQE 2021)
Renate A. Schmidt, Christoph Wernhard, and Yizheng Zhao, editors
Volume 3009 of CEUR Workshop Proceedings, Aachen, 2021. CEUR-WS.org.
bib | http ]

Applying Second-Order Quantifier Elimination in Inspecting Gödel's Ontological Proof
Christoph Wernhard
In Renate A. Schmidt, Christoph Wernhard, and Yizheng Zhao, editors, Proceedings of the Second Workshop on Second-Order Quantifier Elimination and Related Topics (SOQE 2021), volume 3009 of CEUR Workshop Proceedings, pages 98-111, 2021.
[ Preprint (extended version): https://arxiv.org/abs/2110.11108 ].
bib | .pdf ]

In recent years, Gödel's ontological proof and variations of it were formalized and analyzed with automated tools in various ways. We supplement these analyses with a modeling in an automated environment based on first-order logic extended by predicate quantification. Formula macros are used to structure complex formulas and tasks. The analysis is presented as a generated typeset document where informal explanations are interspersed with pretty-printed formulas and outputs of reasoners for first-order theorem proving and second-order quantifier elimination. Previously unnoticed or obscured aspects and details of Gödel's proof become apparent. Practical application possibilities of second-order quantifier elimination are shown, and the encountered elimination tasks may serve as benchmarks.

Learning from Łukasiewicz and Meredith: Investigations into Proof Structures
Christoph Wernhard and Wolfgang Bibel
In André Platzer and Geoff Sutcliffe, editors, Automated Deduction: CADE 2021, volume 12699 of LNCS (LNAI), pages 58-75. Springer, 2021.
[ Preprint (extended version): https://arxiv.org/abs/2104.13645 | Presentation slides ].
bib | DOI ]

The material presented in this paper contributes to establishing a basis deemed essential for substantial progress in Automated Deduction. It identifies and studies global features in selected problems and their proofs which offer the potential of guiding proof search in a more direct way. The studied problems are of the widespread form of “axiom(s) and rule(s) imply goal(s)”. The features include the well-known concept of lemmas. For their elaboration, both human and automated proofs of selected theorems are taken into close comparative consideration. The study at the same time accounts for a coherent and comprehensive formal reconstruction of historical work by Łukasiewicz, Meredith and others. First experiments resulting from the study indicate novel ways of lemma generation to supplement automated first-order provers of various families, strengthening in particular their ability to find short proofs.

Craig Interpolation with Clausal First-Order Tableaux
Christoph Wernhard
Journal of Automated Reasoning, 65(5):647-690, 2021.
bib | DOI ]

We develop foundations for computing Craig-Lyndon interpolants of two given formulas with first-order theorem provers that construct clausal tableaux. Provers that can be understood in this way include efficient machine-oriented systems based on calculi of two families: goal-oriented such as model elimination and the connection method, and bottom-up such as the hypertableau calculus. We present the first interpolation method for first-order proofs represented by closed tableaux that proceeds in two stages, similar to known interpolation methods for resolution proofs. The first stage is an induction on the tableau structure, which is sufficient to compute propositional interpolants. We show that this can linearly simulate different prominent propositional interpolation methods that operate by an induction on a resolution deduction tree. In the second stage, interpolant lifting, quantified variables are introduced to replace certain terms (constants and compound terms). We justify the correctness of interpolant lifting (for the case without built-in equality) abstractly on the basis of Herbrand's theorem and for a different characterization of the formulas to be lifted than in the literature. In addition, we discuss various subtle aspects that are relevant for the investigation and practical realization of first-order interpolation based on clausal tableaux.
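
A worked propositional instance of the first stage (standard material, included for illustration):

    F = p ∧ q,   G = p ∨ r,   F ⊨ G;
    shared vocabulary: {p};   interpolant: I = p,
    since p ∧ q ⊨ p and p ⊨ p ∨ r.

The second stage only comes into play at the first-order level, where, for example, constants stemming from Skolemization are replaced by quantified variables.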

Facets of the PIE Environment for Proving, Interpolating and Eliminating on the Basis of First-Order Logic
Christoph Wernhard
In Petra Hofstedt, Salvador Abreu, Ulrich John, Herbert Kuchen, and Dietmar Seipel, editors, Declarative Programming and Knowledge Management (DECLARE 2019), Revised Selected Papers, volume 12057 of LNCS (LNAI), pages 160-177. Springer, 2020.
bib | DOI ]

PIE is a Prolog-embedded environment for automated reasoning on the basis of first-order logic. Its main focus is on formulas, as constituents of complex formalizations that are structured through formula macros, and as outputs of reasoning tasks such as second-order quantifier elimination and Craig interpolation. It supports a workflow based on documents that intersperse macro definitions, invocations of reasoners, and LaTeX-formatted natural language text. Starting from various examples, the paper discusses features and application possibilities of PIE along with current limitations and issues for future research.

KBSET - Knowledge-Based Support for Scholarly Editing and Text Processing with Declarative LaTeX Markup and a Core Written in SWI-Prolog
Jana Kittelmann and Christoph Wernhard
In Petra Hofstedt, Salvador Abreu, Ulrich John, Herbert Kuchen, and Dietmar Seipel, editors, Declarative Programming and Knowledge Management (DECLARE 2019), Revised Selected Papers, volume 12057 of LNCS (LNAI), pages 178-196. Springer, 2020.
bib | DOI ]

KBSET is an environment that provides support for scholarly editing in two flavors: First, as a practical tool KBSET/Letters that accompanies the development of editions of correspondences (in particular from the 18th and 19th century), completely from source documents to PDF and HTML presentations. Second, as a prototypical tool KBSET/NER for experimentally investigating novel forms of working on editions that are centered around automated named entity recognition. KBSET can process declarative application-specific markup that is expressed in LaTeX notation and incorporate large external fact bases that are typically provided in RDF. KBSET includes specially developed LaTeX styles and a core system that is written in SWI-Prolog, which is used there in many roles, realizing the potential of Prolog as a unifying language.

Von der Transkription zur Wissensbasis. Zum Zusammenspiel von digitalen Editionstechniken und Formen der Wissensrepräsentation am Beispiel von Korrespondenzen Johann Georg Sulzers
Jana Kittelmann and Christoph Wernhard
In Jana Kittelmann and Anne Purschwitz, editors, Aufklärungsforschung digital. Konzepte, Methoden, Perspektiven, volume 10/2019 of IZEA - Kleine Schriften, pages 84-114. Mitteldeutscher Verlag, 2019.
bib ]

PIE - Proving, Interpolating and Eliminating on the Basis of First-Order Logic
Christoph Wernhard
In Salvador Abreu, Petra Hofstedt, Ulrich John, Herbert Kuchen, and Dietmar Seipel, editors, Pre-proceedings of the DECLARE 2019 Conference, volume abs/1909.04870 of CoRR, 2019.
bib | arXiv ]

PIE is a Prolog-embedded environment for automated reasoning on the basis of first-order logic. It includes a versatile formula macro system and supports the creation of documents that intersperse macro definitions, reasoner invocations and LaTeX-formatted natural language text. Invocation of various reasoners is supported: External provers as well as sub-systems of PIE, which include preprocessors, a Prolog-based first-order prover, methods for Craig interpolation and methods for second-order quantifier elimination.

KBSET - Knowledge-Based Support for Scholarly Editing and Text Processing
Jana Kittelmann and Christoph Wernhard
In Salvador Abreu, Petra Hofstedt, Ulrich John, Herbert Kuchen, and Dietmar Seipel, editors, Pre-proceedings of the DECLARE 2019 Conference, volume abs/1909.04870 of CoRR, 2019.
bib | arXiv ]

KBSET supports a practical workflow for scholarly editing, based on using LaTeX with dedicated commands for semantics-oriented markup and a Prolog-implemented core system. Prolog plays various roles there: as query language and access mechanism for large Semantic Web fact bases, as data representation of structured documents, and as a workflow model for advanced application tasks. The core system includes a LaTeX parser and a facility for the identification of named entities. We also sketch future perspectives of this approach to scholarly editing based on techniques of computational logic.

Craig Interpolation and Access Interpolation with Clausal First-Order Tableaux
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 18-01, Technische Universität Dresden, 2018.
bib | arXiv ]

We develop foundations for computing Craig interpolants and similar intermediates of two given formulas with first-order theorem provers that construct clausal tableaux. Provers that can be understood in this way include efficient machine-oriented systems based on calculi of two families: goal-oriented like model elimination and the connection method, and bottom-up like the hyper tableau calculus. The presented method for Craig-Lyndon interpolation involves a lifting step where terms are replaced by quantified variables, similar to what is known for resolution-based interpolation, but applied to a differently characterized ground formula and proven correct more abstractly on the basis of Herbrand's theorem, independently of a particular calculus. Access interpolation is a recent form of interpolation for database query reformulation that applies to first-order formulas with relativized quantifiers and constrains the quantification patterns of predicate occurrences. It has been previously investigated in the framework of Smullyan's non-clausal tableaux. Here, in essence, we simulate these with the more machine-oriented clausal tableaux through structural constraints that can be ensured either directly by bottom-up tableau construction methods or, for closed clausal tableaux constructed with arbitrary calculi, by postprocessing with restructuring transformations.

Proceedings of the Workshop on Second-Order Quantifier Elimination and Related Topics, SOQE 2017
Patrick Koopmann, Sebastian Rudolph, Renate A. Schmidt, and Christoph Wernhard, editors
Volume 2013 of CEUR Workshop Proceedings. CEUR-WS.org, 2017.
bib | http ]

Approximating Resultants of Existential Second-Order Quantifier Elimination upon Universal Relational First-Order Formulas
Christoph Wernhard
In Patrick Koopmann, Sebastian Rudolph, Renate A. Schmidt, and Christoph Wernhard, editors, Proceedings of the Workshop on Second-Order Quantifier Elimination and Related Topics, SOQE 2017, volume 2013 of CEUR Workshop Proceedings, pages 82-98. CEUR-WS.org, 2017.
bib | .pdf ]

We investigate second-order quantifier elimination for a class of formulas characterized by a restriction on the quantifier prefix: existential predicate quantifiers followed by universal individual quantifiers and a relational matrix. For a given second-order formula of this class, a possibly infinite sequence of universal first-order formulas that have increasing strength and are all entailed by the second-order formula can be constructed. Any first-order consequence of the second-order formula is a consequence of some member of the sequence. The sequence provides a recursive base for the first-order theory of the second-order formula, in the sense investigated by Craig. The restricted formula class allows further properties to be derived, for example that the set of those members of the sequence that are equivalent to the second-order formula, or, more generally, have the same first-order consequences, is co-recursively enumerable. Also the set of first-order formulas that entail the second-order formula is co-recursively enumerable. These properties are proven with formula-based tools used in automated deduction, such as domain closure axioms, eliminating individual quantifiers by ground expansion, predicate quantifier elimination with Ackermann's Lemma, Craig interpolation, and decidability of the Bernays-Schönfinkel-Ramsey class.
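
For reference, Ackermann's Lemma, mentioned at the end, in one standard formulation (stated here for illustration):

    ∃P (∀x̄ (A(x̄) → P(x̄)) ∧ F)  ≡  F[P ↦ A],

provided P does not occur in A and occurs in F only negatively; a dual version covers formulas in which P occurs only positively.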

The Boolean Solution Problem from the Perspective of Predicate Logic - Extended Version
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 17-01, Technische Universität Dresden, 2017.
bib | arXiv ]

Finding solution values for unknowns in Boolean equations was a principal reasoning mode in the Algebra of Logic of the 19th century. Schröder investigated it as Auflösungsproblem (solution problem). It is closely related to the modern notion of Boolean unification. Today it is commonly presented in an algebraic setting, but seems potentially useful also in knowledge representation based on predicate logic. We show that it can be modeled on the basis of first-order logic extended by second-order quantification. A wealth of classical results transfers, foundations for algorithms unfold, and connections with second-order quantifier elimination and Craig interpolation show up. Although for first-order inputs the set of solutions is recursively enumerable, the development of constructive methods remains a challenge. We identify some cases that allow constructions, most of them based on Craig interpolation, and show a method to take vocabulary restrictions on solution components into account.

The Boolean Solution Problem from the Perspective of Predicate Logic
Christoph Wernhard
In Clare Dixon and Marcelo Finger, editors, 11th International Symposium on Frontiers of Combining Systems, FroCoS 2017, volume 10483 of LNCS (LNAI), pages 333-350. Springer, 2017.
bib | DOI ]

Finding solution values for unknowns in Boolean equations was a principal reasoning mode in the Algebra of Logic of the 19th century. Schröder investigated it as Auflösungsproblem (solution problem). It is closely related to the modern notion of Boolean unification. Today it is commonly presented in an algebraic setting, but seems potentially useful also in knowledge representation based on predicate logic. We show that it can be modeled on the basis of first-order logic extended by second-order quantification. A wealth of classical results transfers, foundations for algorithms unfold, and connections with second-order quantifier elimination and Craig interpolation show up.

Craig Interpolation and Query Reformulation with Clausal First-Order Tableaux
Christoph Wernhard
Poster presentation at Automated Reasoning with Analytic Tableaux and Related Methods: 26th International Conference, TABLEAUX 2017, 2017.
bib | .pdf ]

The PIE system for Proving, Interpolating and Eliminating
Christoph Wernhard
In Pascal Fontaine, Stephan Schulz, and Josef Urban, editors, 5th Workshop on Practical Aspects of Automated Reasoning, PAAR 2016, volume 1635 of CEUR Workshop Proceedings, pages 125-138. CEUR-WS.org, 2016.
bib | .pdf ]

The PIE system aims at providing an environment for creating complex applications of automated first-order theorem proving techniques. It is embedded in Prolog. Beyond actual proving tasks, interpolation and second-order quantifier elimination are also supported. A macro feature and a LaTeX formula pretty-printer facilitate the construction of elaborate formalizations from small, understandable and documented units. For use with interpolation and elimination, preprocessing operations allow the semantics of chosen predicates to be preserved. The system comes with a built-in default prover that can compute interpolants.

Towards Knowledge-Based Assistance for Scholarly Editing
Jana Kittelmann and Christoph Wernhard
In Thomas C. Hales, Cezary Kaliszyk, Stephan Schulz, and Josef Urban, editors, 1st Conference on Artificial Intelligence and Theorem Proving, AITP 2016 (Book of Abstracts), pages 29-31, 2016.
bib | http ]

We investigate possibilities to utilize techniques of computational logic for scholarly editing. Semantic Web technology already makes relevant large knowledge bases available in the form of logic formulas. There are several further roles for logic-based reasoning in machine-supported scholarly editing. KBSET, a free prototype system, provides a platform for experiments.

Knowledge-Based Support for Scholarly Editing and Text Processing
Jana Kittelmann and Christoph Wernhard
In DHd 2016 - Digital Humanities im deutschsprachigen Raum: Modellierung - Vernetzung - Visualisierung. Die Digital Humanities als fächerübergreifendes Forschungsparadigma. Konferenzabstracts, pages 178-181, Duisburg, 2016. nisaba verlag.
bib | DOI ]

Heinrich Behmann's Contributions to Second-Order Quantifier Elimination from the View of Computational Logic
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 15-05, Technische Universität Dresden, 2015.
Revised 2017.
bib | arXiv ]

For relational monadic formulas (the Löwenheim class) second-order quantifier elimination, which is closely related to computation of uniform interpolants, projection and forgetting - operations that currently receive much attention in knowledge processing - always succeeds. The decidability proof for this class by Heinrich Behmann from 1922 explicitly proceeds by elimination with equivalence preserving formula rewriting. Here we reconstruct the results from Behmann's publication in detail and discuss related issues that are relevant in the context of modern approaches to second-order quantifier elimination in computational logic. In addition, an extensive documentation of the letters and manuscripts in Behmann's bequest that concern second-order quantifier elimination is given, including a commented register and English abstracts of the German sources with focus on technical material. In the late 1920s Behmann attempted to develop an elimination-based decision method for formulas with predicates whose arity is larger than one. His manuscripts and the correspondence with Wilhelm Ackermann show technical aspects that are still of interest today and give insight into the genesis of Ackermann's landmark paper “Untersuchungen über das Eliminationsproblem der mathematischen Logik” from 1935, which laid the foundation of the two prevailing modern approaches to second-order quantifier elimination.

Some Fragments Towards Establishing Completeness Properties of Second-Order Quantifier Elimination Methods
Christoph Wernhard
Poster presentation at Jahrestreffen der GI Fachgruppe Deduktionssysteme, associated with CADE-25, 2015.
bib | .pdf ]

Second-Order Quantifier Elimination on Relational Monadic Formulas - A Basic Method and Some Less Expected Applications
Christoph Wernhard
In Hans de Nivelle, editor, Automated Reasoning with Analytic Tableaux and Related Methods: 24th International Conference, TABLEAUX 2015, volume 9323 of LNCS (LNAI), pages 249-265. Springer, 2015.
bib | DOI ]

For relational monadic formulas (the Löwenheim class) second-order quantifier elimination, which is closely related to computation of uniform interpolants, forgetting and projection, always succeeds. The decidability proof for this class by Behmann from 1922 explicitly proceeds by elimination with equivalence preserving formula rewriting. We reconstruct Behmann's method, relate it to the modern DLS elimination algorithm and show some applications where the essential monadicity becomes apparent only at second sight: deciding ALCOQH knowledge bases, elimination in DL-Lite knowledge bases, and justifying the success of elimination methods for Sahlqvist formulas.
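
A minimal worked elimination instance of this kind (illustrative, not from the paper):

    ∃P (∀x (P(x) → q(x)) ∧ ∃x P(x))  ≡  ∃x q(x),

obtained, for example, with the dual version of Ackermann's Lemma by substituting q for P.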

Second-Order Quantifier Elimination on Relational Monadic Formulas - A Basic Method and Some Less Expected Applications (Extended Version)
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 15-04, Technische Universität Dresden, 2015.
bib | .pdf ]

For relational monadic formulas (the Löwenheim class) second-order quantifier elimination, which is closely related to computation of uniform interpolants, forgetting and projection, always succeeds. The decidability proof for this class by Behmann from 1922 explicitly proceeds by elimination with equivalence preserving formula rewriting. We reconstruct Behmann's method, relate it to the modern DLS elimination algorithm and show some applications where the essential monadicity becomes apparent only at second sight: deciding ALCOQH knowledge bases, elimination in DL-Lite knowledge bases, and justifying the success of elimination methods for Sahlqvist formulas.

Second-Order Characterizations of Definientia in Formula Classes
Christoph Wernhard
In Alexander Bolotov and Manfred Kerber, editors, Joint Automated Reasoning Workshop and Deduktionstreffen, ARW-DT 2014, pages 36-37, 2014.
Workshop presentation [ Slides ].
bib | .pdf ]

Application Patterns of Projection/Forgetting
Christoph Wernhard
In Laura Kovács and Georg Weissenbacher, editors, Workshop on Interpolation: From Proofs to Applications, iPRA 2014, 2014.
Workshop presentation [ Slides ].
bib | .pdf ]

Second-Order Characterizations of Definientia in Formula Classes
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 14-03, Technische Universität Dresden, 2014.
bib | .pdf ]

Predicate quantification can be applied to characterize definientia of a given formula that are in terms of a given set of predicates. Methods for second-order quantifier elimination and the closely related computation of forgetting, projection and uniform interpolants can then be applied to compute such definientia. Here we address the question of whether this principle can be transferred to definientia in given classes that allow efficient processing, such as Horn or Krom formulas. Indeed, if propositional logic is taken as basis, for the class of all formulas that are equivalent to a conjunction of atoms and the class of all formulas that are equivalent to a Krom formula, the existence of definientia as well as representative definientia themselves can be characterized in terms of predicate quantification. For the class of formulas that are equivalent to a Horn formula, this is possible with a special further operator. For first-order logic as basis, we indicate guidelines and open issues.

Semantik, Linked Data, Web-Präsentation: Grundlagen der Nachlasserschließung im Portal www.pueckler-digital.de
Jana Kittelmann and Christoph Wernhard
In Anne Baillot and Anna Busch, editors, Workshop Datenmodellierung in digitalen Briefeditionen und ihre interpretatorische Leistung. Humboldt-Universität zu Berlin, 2014.
Poster presentation, [ Abstract | Poster ].
bib ]

Expressing View-Based Query Processing and Related Approaches with Second-Order Operators
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 14-02, Technische Universität Dresden, 2014.
bib | .pdf ]

Modeling the Suppression Task under Weak Completion and Well-Founded Semantics
Emmanuelle-Anna Dietz, Steffen Hölldobler, and Christoph Wernhard
Journal of Applied Non-Classical Logics, 24(1-2):61-85, 2014.
bib ]

Computing with Logic as Operator Elimination: The ToyElim System
Christoph Wernhard
In Hans Tompits, Salvador Abreu, Johannes Oetsch, Jörg Pührer, Dietmar Seipel, Masanobu Umeda, and Armin Wolf, editors, Applications of Declarative Programming and Knowledge Management, 19th International Conference (INAP 2011) and 25th Workshop on Logic Programming (WLP 2011), Revised Selected Papers, volume 7773 of LNCS (LNAI), pages 289-296. Springer, 2013.
bib | DOI ]

A prototype system is described whose core functionality is, based on propositional logic, the elimination of second-order operators, such as Boolean quantifiers and operators for projection, forgetting and circumscription. This approach allows many representational and computational tasks in knowledge representation - for example, the computation of abductive explanations and of models with respect to logic programming semantics - to be expressed in a uniform operational system, backed by a uniform classical semantic framework.
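
The core elimination step for Boolean quantifiers can be sketched in a few lines of SWI-Prolog (a hedged sketch, not ToyElim's actual code; formulas are assumed to be ground terms built with ,/2, ;/2, ~/1 and the constants true and false):

    % forget(+Atom, +F, -G): G is F with Atom forgotten, computed as
    % F[Atom/true] ; F[Atom/false] (existential Boolean quantification).
    forget(P, F, (F1 ; F0)) :-
        substitute(P, true, F, F1),
        substitute(P, false, F, F0).

    substitute(P, V, P, V) :- !.
    substitute(_, _, F, F) :- atomic(F), !.
    substitute(P, V, F, G) :-
        F =.. [Op|Args],
        maplist(substitute(P, V), Args, Args1),
        G =.. [Op|Args1].

For example, forgetting p in (p , (~p ; q)) yields a formula equivalent to q.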

Semantik, Web, Metadaten und digitale Edition: Grundlagen und Ziele der Erschließung neuer Quellen des Branitzer Pückler-Archivs
Jana Kittelmann and Christoph Wernhard
In Irene Krebs et al., editors, Resonanzen. Pücklerforschung im Spannungsfeld zwischen Wissenschaft und Kunst. Ein Konferenzbericht., pages 179-202. trafo Verlag, Berlin, 2013.
bib ]

Möglichkeiten der Repräsentation von Nachlassarchiven im Web
Jana Kittelmann and Christoph Wernhard
In Fachtagung Semantische Technologien - Verwertungsstrategien und Konvergenz, Humboldt-Universität zu Berlin, Position Papers, 2013.
bib | http ]

Abduction in Logic Programming as Second-Order Quantifier Elimination
Christoph Wernhard
In Pascal Fontaine, Christophe Ringeissen, and Renate A. Schmidt, editors, 9th International Symposium on Frontiers of Combining Systems, FroCoS 2013, volume 8152 of LNCS (LNAI), pages 103-119. Springer, 2013.
bib | DOI ]

It is known that skeptical abductive explanations with respect to classical logic can be characterized semantically in a natural way as formulas with second-order quantifiers. Computing explanations is then just elimination of the second-order quantifiers. By using application patterns and generalizations of second-order quantification, like literal projection, the globally weakest sufficient condition and circumscription, we transfer these principles in a unifying framework to abduction with three non-classical semantics of logic programming: stable model, partial stable model and well-founded semantics. New insights are revealed about abduction with the partial stable model semantics.
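
A minimal classical instance of the principle (classical case only; the paper's point is its transfer to logic programming semantics): with background theory p → q, observation q and the abducible p, the weakest sufficient condition of q on {p} is obtained by second-order quantification upon the remaining vocabulary and eliminating the quantifier:

    ∀q ((p → q) → q)  ≡  p,

yielding p as the weakest explanation.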

Soundness of Inprocessing in Clause Sharing SAT Solvers
Norbert Manthey, Tobias Philipp, and Christoph Wernhard
In Matti Järvisalo and Allen Van Gelder, editors, Theory and Applications of Satisfiability Testing, 16th International Conference, SAT 2013, volume 7962 of LNCS, pages 22-39. Springer, 2013.
Received the SAT 2013 Best Paper Award [ Preprint ].
bib | DOI ]

We present a formalism that models the computation of clause sharing portfolio solvers with inprocessing. The soundness of these solvers is not a straightforward property since shared clauses can make a formula unsatisfiable. Therefore, we develop characterizations of simplification techniques and suggest various settings in which clause sharing and inprocessing can be combined. Our formalization models most of the recently implemented portfolio systems, and we indicate possibilities to improve these. A particular improvement is a novel way to combine clause addition techniques - like blocked clause addition - with clause deletion techniques - like blocked clause elimination or variable elimination.

Towards a Declarative Approach to Model Human Reasoning with Nonmonotonic Logics
Christoph Wernhard
In Thomas Barkowsky, Marco Ragni, and Frieder Stolzenburg, editors, Human Reasoning and Automated Deduction: KI 2012 Workshop Proceedings, volume SFB/TR 8 Report 032-09/2012 of Report Series of the Transregional Collaborative Research Center SFB/TR 8 Spatial Cognition, pages 41-48. Universität Bremen / Universität Freiburg, Germany, 2012.
bib ]

Stenning and van Lambalgen introduced an approach to model empirically studied human reasoning with nonmonotonic logics. Some of the research questions that have been brought up in this context concern the interplay of the open- and closed-world assumption, the suitability of particular logic programming semantics for the modeling of human reasoning, and the role of three-valued logic programming semantics and three-valued logics. We look into these questions from the view of a framework where logic programs that model human reasoning are represented declaratively and are mechanizable by classical formulas extended with certain second-order operators.

Projection and Scope-Determined Circumscription
Christoph Wernhard
Journal of Symbolic Computation, 47:1089-1108, 2012.
bib | DOI ]

We develop a semantic framework that extends first-order logic by literal projection and a novel second semantically defined operator, “raising”, which is only slightly different from literal projection and can be used to define a generalization of parallel circumscription with varied predicates in a straightforward and compact way. We call this variant of circumscription “scope-determined”, since like literal projection and raising its effects are controlled by a so-called “scope”, that is, a set of literals, as parameter. We work out formally a toolkit of propositions about projection, raising and circumscription and their interaction. It reveals some refinements of and new views on previously known properties. In particular, we apply it to show that well-foundedness with respect to circumscription can be expressed in terms of projection, and that a characterization of the consequences of circumscribed propositional formulas in terms of literal projection can be generalized to first-order logic and expressed compactly in terms of new variants of the strongest necessary and weakest sufficient condition.

Forward Human Reasoning Modeled by Logic Programming Modeled by Classical Logic with Circumscription and Projection
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 11-07, Technische Universität Dresden, 2011.
bib | .pdf ]

Recently, an approach to model human reasoning as studied in cognitive science by logic programming has been introduced by Stenning and van Lambalgen and exemplified with the suppression task. We investigate this approach from the view of a framework where different logic programming semantics correspond to different translations of logic programs into formulas of classical two-valued logic extended by two second-order operators, circumscription and literal projection. Based on combining and extending previously known such renderings of logic programming semantics, we take semantics into account that have not yet been considered in the context of human reasoning, such as stable models and partial stable models. To model human reasoning, it is essential that only some predicates can be subjected to closed-world reasoning, while others are handled by open-world reasoning. In our framework, variants of familiar logic programming semantics that are extended with this feature are derived from a generic circumscription-based representation. Further, we develop a two-valued representation of a three-valued logic that renders semantics considered for human reasoning based on the Fitting operator.

Computing with Logic as Operator Elimination: The ToyElim System
Christoph Wernhard
In Proceedings of the 25th Workshop on Logic Programming, WLP 2011, Infsys Research Report 1843-11-06, pages 94-98. Technische Universität Wien, 2011.
bib | arXiv ]

A prototype system is described whose core functionality is, based on propositional logic, the elimination of second-order operators, such as Boolean quantifiers and operators for projection, forgetting and circumscription. This approach allows many representational and computational tasks in knowledge representation - for example, the computation of abductive explanations and of models with respect to logic programming semantics - to be expressed in a uniform operational system, backed by a uniform classical semantic framework.

An Abductive Model for Human Reasoning
Steffen Hölldobler, Tobias Philipp, and Christoph Wernhard
In Logical Formalizations of Commonsense Reasoning, Papers from the AAAI 2011 Spring Symposium, AAAI Spring Symposium Series Technical Reports, pages 135-138. AAAI Press, 2011.
bib ]

In this paper we contribute to bridging the gap between human reasoning as studied in Cognitive Science and commonsense reasoning based on formal logics and formal theories. In particular, the suppression task studied in Cognitive Science provides an interesting challenge problem for human reasoning based on logic. The work presented in the paper is founded on the recent approach by Stenning and van Lambalgen to model human reasoning by means of logic programs with a specific three-valued completion semantics and a semantic fixpoint operator that yields a least model, as well as abduction. Their approach has subsequently been made more precise and technically accurate by switching to three-valued Łukasiewicz logic. In this paper, we extend this refined approach by abduction. We show that the inclusion of abduction permits adequately modeling additional empirical results reported from Cognitive Science. For the arising abductive reasoning tasks we give complexity results. Finally, we outline several open research issues that emerge from the application of logic to model human reasoning.

Circumscription and Projection as Primitives of Logic Programming
Christoph Wernhard
In M. Hermenegildo and T. Schaub, editors, Technical Communications of the 26th International Conference on Logic Programming, ICLP'10, volume 7 of Leibniz International Proceedings in Informatics (LIPIcs), pages 202-211, Dagstuhl, Germany, 2010. Schloss Dagstuhl-Leibniz-Zentrum für Informatik.
bib | DOI | http ]

We pursue a representation of logic programs as classical first-order sentences. Different semantics for logic programs can then be expressed by the way in which they are wrapped into - semantically defined - operators for circumscription and projection. (Projection is a generalization of second-order quantification.) We demonstrate this for the stable model semantics, Clark's completion and a three-valued semantics based on the Fitting operator. To represent the latter, we utilize the polarity sensitivity of projection, in contrast to second-order quantification, and a variant of circumscription that allows predicate minimization to be expressed in parallel with maximization. In accord with the aim of an integrated view on different logic-based representation techniques, the material is worked out on the basis of first-order logic with a Herbrand semantics.

Literal Projection and Circumscription
Christoph Wernhard
In Nicolas Peltier and Viorica Sofronie-Stokkermans, editors, Proceedings of the 7th International Workshop on First-Order Theorem Proving, FTP'09, volume 556 of CEUR Workshop Proceedings, pages 60-74. CEUR-WS.org, 2010.
bib | http ]

We develop a formal framework intended as a preliminary step for a single knowledge representation system that provides different representation techniques in a unified way. In particular we consider first-order logic extended by techniques for second-order quantifier elimination and non-monotonic reasoning. In this paper two independent results are developed. The background for the first result is literal projection, a generalization of second-order quantification which permits, so to speak, quantification upon arbitrary sets of ground literals, instead of just (all ground literals with) a given predicate symbol. We introduce an operator raise that is only slightly different from literal projection and can be used to define a generalization of predicate circumscription in a straightforward and compact way. We call this variant of circumscription scope-determined. Some properties of raise and scope-determined circumscription, also in combination with literal projection, are then shown. A previously known characterization of consequences of circumscribed formulas in terms of literal projection is generalized from propositional to first-order logic and proven on the basis of the introduced concepts. The second result developed in this paper is a characterization of stable models in terms of circumscription. Unlike traditional characterizations, it does not resort to syntactic notions like reduct and fixed-point construction. It essentially renders a recently proposed “circumscription-like” characterization in a compact way, without involvement of a non-classically interpreted connective.

Tableaux for Projection Computation and Knowledge Compilation
Christoph Wernhard
In Martin Giese and Arild Waaler, editors, Automated Reasoning with Analytic Tableaux and Related Methods: 18th International Conference, TABLEAUX 2009, volume 5607 of LNCS (LNAI), pages 325-340. Springer, 2009.
bib | DOI | .pdf ]

Projection computation is a generalization of second-order quantifier elimination, which in turn is closely related to the computation of forgetting and of uniform interpolants. On the basis of a unified view on projection computation and knowledge compilation, we develop a framework for applying tableau methods to these tasks. It takes refinements from performance-oriented systems into account. Formula simplifications are incorporated at the level of tableau structure modification, and at the level of simplifying encountered subformulas that are not yet fully compiled. In particular, such simplifications can involve projection computation, where this is possible with low cost. We represent tableau construction by means of rewrite rules on formulas, extended with some auxiliary functors, which is particularly convenient for formula transformation tasks. As instantiations of the framework, we discuss approaches to propositional knowledge compilation from the literature, including adaptions of DPLL, and the hyper tableau calculus for first-order clauses.

Automated Deduction for Projection Elimination
Christoph Wernhard
Number 324 in Dissertations in Artificial Intelligence. AKA Verlag/IOS Press, Heidelberg, Amsterdam, 2009.
bib | http ]

Projection is a logic operation which allows tasks in knowledge representation to be expressed. These tasks involve extraction or removal of knowledge concerning a given sub-vocabulary. It is a generalization of second-order quantification, permitting, so to speak, to `quantify' upon an arbitrary set of ground literals instead of just (all ground literals with) a given predicate symbol. In Automated Deduction for Projection Elimination, a semantic characterization of projection for first-order logic is presented. On this basis, properties underlying applications and processing methods are derived. The computational processing of projection, called projection elimination in analogy to quantifier elimination, can be performed by adapted theorem proving methods. This is shown for resolvent generation and, in more depth, tableau construction. An abstract framework relates projection elimination with knowledge compilation and shows the adaption of key features of high-performance tableau systems. As a prototypical instance, an adaption of a modern DPLL method, such as underlying state-of-the-art SAT solvers, is worked out. It generalizes various recent knowledge compilation methods and utilizes the interplay with projection elimination for efficiency improvements.

HydraSAT 2009.3 Solver Description
Christoph Baldow, Friedrich Gräter, Steffen Hölldobler, Norbert Manthey, Max Seelemann, Peter Steinke, Christoph Wernhard, Konrad Winkler, and Erik Zenker
In Daniel Le Berre et al., editor, SAT 2009 Competitive Event Booklet, pages 15-16, 2009.
bib | .pdf ]

Literal Projection for First-Order Logic
Christoph Wernhard
In Steffen Hölldobler, Carsten Lutz, and Heinrich Wansing, editors, Logics in Artificial Intelligence: 11th European Conference, JELIA 08, volume 5293 of LNCS (LNAI), pages 389-402. Springer, 2008.
bib | DOI ]

The computation of literal projection generalizes predicate quantifier elimination by permitting, so to speak, quantification upon arbitrary sets of ground literals, instead of just (all ground literals with) a given predicate symbol. Literal projection allows, for example, expressing predicate quantification upon a predicate just in positive or negative polarity. Occurrences of the predicate in literals with the complementary polarity are then considered as unquantified predicate symbols. We present a formalization of literal projection and related concepts, such as literal forgetting, for first-order logic with a Herbrand semantics, which makes these notions easy to access, since they are expressed there by means of straightforward relationships between sets of literals. With this formalization, we show properties of literal projection which hold for formulas that are free of certain links, pairs of literals with complementary instances, each in a different conjunct of a conjunction, both in the scope of a universal first-order quantifier, or one in a subformula and the other in its context formula. These properties can justify the application of methods that construct formulas without such links to the computation of literal projection. Some tableau methods and direct methods for second-order quantifier elimination can be understood in this way.

System Description: E-KRHyper
Björn Pelzer and Christoph Wernhard
In Frank Pfenning, editor, Automated Deduction: CADE-21, volume 4603 of LNCS (LNAI), pages 503-513. Springer, 2007.
bib | DOI | .pdf ]

The E-KRHyper system is a model generator and theorem prover for first-order logic with equality. It implements the new E-hyper tableau calculus, which integrates a superposition-based handling of equality into the hyper tableau calculus. E-KRHyper extends our previous KRHyper system, which has been used in a number of applications in the field of knowledge representation. In contrast to most first-order theorem provers, it supports features important for such applications, for example queries with predicate extensions as answers, handling of large sets of uniformly structured input facts, arithmetic evaluation and stratified negation as failure. It is our goal to extend the range of application possibilities of KRHyper by adding equality reasoning.

Tableaux Between Proving, Projection and Compilation
Christoph Wernhard
Technical Report Arbeitsberichte aus dem Fachbereich Informatik 18/2007, Universität Koblenz-Landau, Institut für Informatik, Universitätsstr. 1, 56070 Koblenz, Germany, 2007.
bib ]

Generalized methods for automated theorem proving can be used to compute formula transformations such as projection elimination and knowledge compilation. We present a framework based on clausal tableaux suited for such tasks. These tableaux are characterized independently of particular construction methods, but important features of empirically successful methods are taken into account, especially dependency directed backjumping and branch local operation. As an instance of that framework an adaption of DPLL is described. We show that knowledge compilation methods can be substantially improved by weaving projection elimination partially into the compilation phase.

Semantic Knowledge Partitioning
Christoph Wernhard
In José Júlio Alferes and João Leite, editors, Logics in Artificial Intelligence: 9th European Conference, JELIA 04, volume 3229 of LNCS (LNAI), pages 552-564. Springer, 2004.
bib | DOI | .ps ]

Some operations to decompose a knowledge base (considered as a first-order logic formula) in ways such that only its semantics determines the results are investigated. Intended uses include the extraction of “parts” relevant to an application, the exploration and utilization of implicit possibilities of structuring a knowledge base, and the formulation of query answers in terms of a signature demanded by an application. A semantic framework based on Herbrand interpretations is outlined. The notion of “model relative to a scope” is introduced. It underlies the partitioning operations “projection” and “forgetting” and also provides a semantic account for certain formula simplification operations. An algorithmic approach which is based on resolution and may be regarded as a variation of the SCAN algorithm is discussed.

Semantic Knowledge Partitioning (Extended Abstract)
Christoph Wernhard
In Ulrike Sattler, editor, Contributions to the Doctoral Programme of the Second International Joint Conference on Automated Reasoning, IJCAR 2004, volume 106 of CEUR Workshop Proceedings. CEUR-WS.org, 2004.
bib | http ]

KRHyper Inside - Model Based Deduction in Applications
Peter Baumgartner, Ulrich Furbach, Margret Gross-Hardt, Thomas Kleemann, and Christoph Wernhard
In Proceedings of the CADE-19 workshop Challenges and Novel Applications for Automated Reasoning, pages 55-72, 2003.
bib | .ps ]

Three real-world applications are depicted which all have a full first-order theorem prover based on the hyper tableau calculus as their core component. These applications concern information retrieval in electronic publishing, the integration of description logics with other knowledge representation techniques, and XML query processing.

System Description: KRHyper
Christoph Wernhard
Technical Report Fachberichte Informatik 14-2003, Universität Koblenz-Landau, Institut für Informatik, Universitätsstr. 1, 56070 Koblenz, Germany, 2003.
Presented at the CADE-19 workshop Model Computation: Principles, Algorithms, Applications.
bib | .ps ]

KRHyper is a first-order logic theorem proving and model generation system based on the hyper tableau calculus. It is targeted for use as an embedded system within knowledge-based applications. In contrast to most first-order theorem provers, it supports features important for such applications, for example queries with predicate extensions as answers, handling of large sets of uniformly structured input facts, arithmetic evaluation, and stratified negation as failure.
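
As a minimal sketch of how stratified negation as failure can be evaluated (invented for illustration; this is not KRHyper's actual implementation), consider a ground rule set evaluated stratum by stratum, where negative body atoms refer only to lower, already computed strata:

    def eval_stratum(rules, model):
        """Extend model to the least fixpoint of rules; negative body atoms
        are tested against lower strata, which are already in model."""
        changed = True
        while changed:
            changed = False
            for head, pos, neg in rules:
                if head not in model and pos <= model and not (neg & model):
                    model.add(head)
                    changed = True

    # Program: p(a).   q(X) :- p(X), not r(X).   (ground instance with X=a)
    stratum0 = [("p(a)", set(), set())]
    stratum1 = [("q(a)", {"p(a)"}, {"r(a)"})]

    model = set()
    for stratum in (stratum0, stratum1):
        eval_stratum(stratum, model)
    print(model)  # {'p(a)', 'q(a)'}: r(a) is not derivable, so "not r(a)" holds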

Using Mathematica and Automated Theorem Provers to Access a Mathematical Library
Ingo Dahn, Andreas Haida, Thomas Honigmann, and Christoph Wernhard
In Proceedings of the CADE-15 Workshop on Integration of Deductive Systems, pages 36-43, 1998.
(Revised and extended version: http://cs.christophwernhard.com/papers/integ.ps).
bib | .ps.gz ]

We describe a concept for the cooperation of a computer algebra system, several automated theorem provers and a mathematical library. The purpose of this cooperation is to enable intelligent retrieval of theorems and definitions from the remote mathematical library. Automated theorem provers compete on remote machines to verify conjectures provided by the user in a local copy of Mathematica. They make use of a remote knowledge base which contains parts of the Mizar Mathematical Library.
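
The competition scheme can be sketched as follows (a hypothetical Python illustration of the pattern only; the run_prover interface and the way provers are addressed are invented, not taken from the paper):

    from concurrent.futures import ThreadPoolExecutor, as_completed

    def run_prover(prover, conjecture):
        """Placeholder: would submit the conjecture to a remote prover and
        return True iff it finds a proof within its time limit."""
        return False

    def verify(conjecture, provers):
        """Run all provers in parallel; the first one to find a proof wins."""
        with ThreadPoolExecutor(max_workers=len(provers)) as pool:
            futures = {pool.submit(run_prover, p, conjecture): p for p in provers}
            for done in as_completed(futures):
                if done.result():
                    return futures[done]  # name of the successful prover
        return None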

First Order Proof Problems Extracted from an Article in the Mizar Mathematical Library
Ingo Dahn and Christoph Wernhard
In Maria Paola Bonacina and Ulrich Furbach, editors, International Workshop on First-Order Theorem Proving, FTP'97, RISC-Linz Report Series No. 97-50, pages 58-62. Johannes Kepler Universität, Linz, 1997.
bib | .ps ]

Over the years, interactive theorem provers have built a large body of verified computer mathematics. The ILF Mathematical Library aims to make this knowledge available to other systems. One of the reasons for such a project is economy. Verification of software and hardware frequently requires the proof of purely mathematical theorems. It is obviously inefficient to use the time of experts in the design of software or hardware systems to prove such theorems. Another reason for presenting a collection of mathematical theorems in a unified framework is safety: it should facilitate the verification of theorems in the library of one system by other systems. A third reason is the dynamics of research. New interactive theorem provers should have the opportunity to demonstrate their usability on real-world problems without having to reprove elementary mathematical facts. Last but not least, it is hoped that reproving theorems from a uniform mathematical library will be considered a challenge for the development of automated theorem provers.

DB-CLOS: Eine Datenbankschnittstelle für das Common Lisp Object System (A Database Interface for the Common Lisp Object System)
Heinz Schweppe, Christoph Wernhard, and Jutta Estenfeld
In W. Remmele, editor, Künstliche Intelligenz in der Praxis (Artificial Intelligence in Practice). Siemens AG, 1990.
bib ]

Unpublished Material

InfraEngine: Inferencing in the Semantic Web by Planning
Christoph Wernhard
System description, 2002 (edited 2007). A prototype implementation can be downloaded from http://www.infraengine.com/.
bib | .pdf ]

The idea of InfraEngine is to help establish a Semantic Web infrastructure by means of a specially adapted AI planning engine. Inputs are distributed Web documents in Semantic Web formats proposed by the World Wide Web Consortium; outputs are delivered in such formats as well. The user interface allows browsing Semantic Web data and controlling the planning services from any Web browser, and other programs can make use of these services too. The aim is to provide a small and understandable, but general, mechanism that can be used for different kinds of applications.

Representing Proofs in the Semantic Web
Christoph Wernhard
Working paper for Persist AG, Teltow, Germany, 2001.
bib | .pdf ]

First steps towards an RDF format for exchanging proofs in the Semantic Web.

Two Short Term Applications of the Semantic Web
Christoph Wernhard
Working paper for Persist AG, Teltow, Germany, 2001.
bib | .pdf ]

We outline two application scenarios that might be in the realm of short-term applications of the Semantic Web: a software packaging system and the organization of a business trip. Both of them can be solved with today's technology to some degree, so they do not show the novel potential of the Semantic Web in full. However, considering Semantic Web solutions for them is useful for getting a picture of the characteristics of the Semantic Web and for becoming aware of some concrete technical issues.

The Planning Web in Action
Christoph Wernhard
Working paper for Persist AG, Teltow, Germany, 2000.
bib | .pdf ]

We propose resource-oriented inference as one way of bringing the Semantic Web into action. It provides a framework for expressing and processing a variety of tasks from areas such as planning, scheduling, manufacturing resource planning, product data management, configuration management, workflow management, and simulation. Resource-oriented inference as a part of the Semantic Web should allow such tasks to be performed within the scope of the World Wide Web. A prototypical application is the purchase of a complex product with the help of the Web. The product consists of multiple parts, some of them complex products themselves. Several services are required to compose the product. Subparts and services can be provided by different companies all over the world. Delivery time and cost of the product should be optimized.

Towards a Semantic Web Modeling Language
Christoph Wernhard
Working paper for Persist AG, Teltow, Germany. Presented at KnowTech 2000, Leipzig, 2000.
bib | .pdf ]

Outline of a small Web-embedded language for specifying types and reasoning about them. Its main features are: (1.) From primitive types, which require of their instances only that they implement a single method, complex types can be constructed by a few operators, such as type intersection. This composition of type definitions maps to the composition of fragments of type definitions in different Web documents. (2.) Types are compared by structural equivalence, not by their names. This facilitates the combination of independently developed models and the implicit association of types with given instances. (3.) The hyperlinking of expressions of the modeling language can be understood in terms of the defining equations of a rewriting system: the left side of such a rewrite rule is a URI, the right side an expression. Hyperlink dereferencing corresponds to a rewrite step with such a rule.
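
As a minimal hypothetical illustration of point (3.) (the URIs and type expressions are invented here, not taken from the paper), consider the defining equations

    http://example.org/Employee  ->  and(http://example.org/Person, hasEmployer)
    http://example.org/Person    ->  and(hasName, hasBirthDate)

read as rewrite rules whose left sides are URIs. Dereferencing http://example.org/Employee corresponds to one rewrite step; repeated rewriting yields and(and(hasName, hasBirthDate), hasEmployer), and structural equivalence (point (2.)) compares such fully expanded expressions rather than the URIs naming them.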

Experiments with a Linear Backward Chaining Planner
Christoph Wernhard
(Edited May 2003), 1999.
bib | .ps ]

The performance of an implementation of the linear backward-chaining planning algorithm is compared to that of other planning systems by means of the problem set of the first AI planning systems competition (AIPS-98).

Entwurf und Implementierung einer Datenbankschnittstelle für das Common Lisp Object System (Design and Implementation of a Database Interface for the Common Lisp Object System)
Christoph Wernhard
Magisterarbeit (Master's thesis), Freie Universität Berlin, Berlin, Germany, 1992.
(The system documentation is available from http://cs.christophwernhard.com/closdb/.)
bib ]
