
Papers

Publications

The Boolean Solution Problem from the Perspective of Predicate Logic - Extended Version
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 17-01, Technische Universität Dresden, 2017.
bib | http ]

Finding solution values for unknowns in Boolean equations was a principal reasoning mode in the Algebra of Logic of the 19th century. Schröder investigated it as Auflösungsproblem (solution problem). It is closely related to the modern notion of Boolean unification. Today it is commonly presented in an algebraic setting, but seems potentially useful also in knowledge representation based on predicate logic. We show that it can be modeled on the basis of first-order logic extended by second-order quantification. A wealth of classical results transfers, foundations for algorithms unfold, and connections with second-order quantifier elimination and Craig interpolation show up. Although for first-order inputs the set of solutions is recursively enumerable, the development of constructive methods remains a challenge. We identify some cases that allow constructions, most of them based on Craig interpolation, and show a method to take vocabulary restrictions on solution components into account.
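
As background for the non-specialist (an illustrative aside, not part of the abstract): in the classical single-unknown case a Boolean equation can be brought into the form f(x) = 0 and solved explicitly. In LaTeX notation, with the Shannon expansion of f:

    f(x) \;=\; (\lnot x \land f(0)) \lor (x \land f(1))
    % the equation f(x) = 0 is solvable iff  f(0) \land f(1) \;\equiv\; 0
    % reproductive general solution, with t an arbitrary parameter:
    x \;=\; f(0) \lor (t \land \lnot f(1))

In the predicate logic modeling sketched in the abstract, such solvability conditions become, roughly, validity statements involving second-order quantifiers.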

The Boolean Solution Problem from the Perspective of Predicate Logic
Christoph Wernhard
In Clare Dixon and Marcelo Finger, editors, 11th International Symposium on Frontiers of Combining Systems, FroCoS 2017, LNCS (LNAI). Springer, 2017.
To appear [ Preprint ].
bib ]

Finding solution values for unknowns in Boolean equations was a principal reasoning mode in the Algebra of Logic of the 19th century. Schröder investigated it as Auflösungsproblem (solution problem). It is closely related to the modern notion of Boolean unification. Today it is commonly presented in an algebraic setting, but seems potentially useful also in knowledge representation based on predicate logic. We show that it can be modeled on the basis of first-order logic extended by second-order quantification. A wealth of classical results transfers, foundations for algorithms unfold, and connections with second-order quantifier elimination and Craig interpolation show up.

The PIE system for Proving, Interpolating and Eliminating
Christoph Wernhard
In Pascal Fontaine, Stephan Schulz, and Josef Urban, editors, 5th Workshop on Practical Aspects of Automated Reasoning (PAAR), number 1635 in CEUR Workshop Proceedings, pages 125-138, Aachen, 2016.
bib | http ]

The PIE system aims at providing an environment for creating complex applications of automated first-order theorem proving techniques. It is embedded in Prolog. Beyond actual proving tasks, interpolation and second-order quantifier elimination are also supported. A macro feature and a LaTeX formula pretty-printer facilitate the construction of elaborate formalizations from small, understandable and documented units. For use with interpolation and elimination, preprocessing operations make it possible to preserve the semantics of chosen predicates. The system comes with a built-in default prover that can compute interpolants.
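
For orientation (an illustrative aside, not PIE's actual input syntax): one of the supported tasks, Craig interpolation, can be pictured with a minimal propositional instance.

    % F entails G; I is a Craig interpolant: F \models I, I \models G,
    % and I contains only the vocabulary shared by F and G (here just q).
    F \;=\; p \land q, \qquad G \;=\; q \lor r, \qquad I \;=\; q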

Towards Knowledge-Based Assistance for Scholarly Editing
Jana Kittelmann and Christoph Wernhard
In Thomas C. Hales, Cezary Kaliszyk, Stephan Schulz, and Josef Urban, editors, 1st Conference on Artificial Intelligence and Theorem Proving, AITP 2016 (Book of Abstracts), pages 29-31, 2016.
bib | .pdf ]

We investigate possibilities to utilize techniques of computational logic for scholarly editing. Semantic Web technology already makes relevant large knowledge bases available in the form of logic formulas. There are several further roles of logic-based reasoning in machine supported scholarly editing. KBSET, a free prototype system, provides a platform for experiments.

Knowledge-Based Support for Scholarly Editing and Text Processing
Jana Kittelmann and Christoph Wernhard
In DHd 2016 - Digital Humanities im deutschsprachigen Raum: Modellierung - Vernetzung - Visualisierung. Die Digital Humanities als fächerübergreifendes Forschungsparadigma. Konferenzabstracts, pages 176-179. nisaba verlag, 2016.
bib | .pdf ]

Heinrich Behmann's Contributions to Second-Order Quantifier Elimination from the View of Computational Logic
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 15-05, Technische Universität Dresden, 2015.
bib | .pdf ]

For relational monadic formulas (the Löwenheim class) second-order quantifier elimination, which is closely related to computation of uniform interpolants, projection and forgetting - operations that currently receive much attention in knowledge processing - always succeeds. The decidability proof for this class by Heinrich Behmann from 1922 explicitly proceeds by elimination with equivalence preserving formula rewriting. Here we reconstruct the results from Behmann's publication in detail and discuss related issues that are relevant in the context of modern approaches to second-order quantifier elimination in computational logic. In addition, we give a summary as well as a commented register of further relevant material in unpublished manuscripts by Behmann and in his correspondence, especially with Wilhelm Ackermann.
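
As a hedged illustration of what elimination in the monadic case amounts to (a standard textbook-style instance, essentially an application of Ackermann's lemma, not taken from Behmann's manuscripts):

    \exists P\, \forall x\, \big((A(x) \rightarrow P(x)) \land (P(x) \rightarrow B(x))\big)
        \;\equiv\; \forall x\, (A(x) \rightarrow B(x))

The second-order quantifier on P disappears, leaving an equivalent first-order formula in the remaining predicates.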

Some Fragments Towards Establishing Completeness Properties of Second-Order Quantifier Elimination Methods
Christoph Wernhard
Poster presentation at Jahrestreffen der GI Fachgruppe Deduktionssysteme, associated with CADE-25, 2015.
bib | .pdf ]

Second-Order Quantifier Elimination on Relational Monadic Formulas - A Basic Method and Some Less Expected Applications
Christoph Wernhard
In Hans de Nivelle, editor, Automated Reasoning with Analytic Tableaux and Related Methods: 24th International Conference, TABLEAUX 2015, volume 9323 of LNCS (LNAI), pages 249-265. Springer, 2015.
bib ]

For relational monadic formulas (the Löwenheim class) second-order quantifier elimination, which is closely related to computation of uniform interpolants, forgetting and projection, always succeeds. The decidability proof for this class by Behmann from 1922 explicitly proceeds by elimination with equivalence preserving formula rewriting. We reconstruct Behmann's method, relate it to the modern DLS elimination algorithm and show some applications where the essential monadicity becomes apparent only at second sight: deciding ALCOQH knowledge bases, elimination in DL-Lite knowledge bases, and the justification of the success of elimination methods for Sahlqvist formulas.

Second-Order Quantifier Elimination on Relational Monadic Formulas - A Basic Method and Some Less Expected Applications (Extended Version)
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 15-04, Technische Universität Dresden, 2015.
bib | .pdf ]

For relational monadic formulas (the Löwenheim class) second-order quantifier elimination, which is closely related to computation of uniform interpolants, forgetting and projection, always succeeds. The decidability proof for this class by Behmann from 1922 explicitly proceeds by elimination with equivalence preserving formula rewriting. We reconstruct Behmann's method, relate it to the modern DLS elimination algorithm and show some applications where the essential monadicity becomes apparent only at second sight: deciding ALCOQH knowledge bases, elimination in DL-Lite knowledge bases, and the justification of the success of elimination methods for Sahlqvist formulas.

Second-Order Characterizations of Definientia in Formula Classes
Christoph Wernhard
In Alexander Bolotov and Manfred Kerber, editors, Joint Automated Reasoning Workshop and Deduktionstreffen (ARW-DT 2014), pages 36-37, 2014.
Workshop presentation [ Slides ].
bib | .pdf ]

Application Patterns of Projection/Forgetting
Christoph Wernhard
In Laura Kovacs and Georg Weissenbacher, editors, Workshop on Interpolation: From Proofs to Applications (iPRA 2014), 2014.
Workshop presentation [ Slides ].
bib | .pdf ]

Second-Order Characterizations of Definientia in Formula Classes
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 14-03, Technische Universität Dresden, 2014.
bib | .pdf ]

Predicate quantification can be applied to characterize definientia of a given formula that are in terms of a given set of predicates. Methods for second-order quantifier elimination and the closely related computation of forgetting, projection and uniform interpolants can then be applied to compute such definientia. Here we address the question whether this principle can be transferred to definientia in given classes that allow efficient processing, such as Horn or Krom formulas. Indeed, if propositional logic is taken as a basis, for the class of all formulas that are equivalent to a conjunction of atoms and the class of all formulas that are equivalent to a Krom formula, the existence of definientia as well as representative definientia themselves can be characterized in terms of predicate quantification. For the class of formulas that are equivalent to a Horn formula, this is possible with a special further operator. For first-order logic as a basis, we indicate guidelines and open issues.
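
As a hedged sketch of the kind of characterization meant here (a standard propositional rendering in terms of predicate quantification; the Horn and Krom refinements are the subject of the report): for an atom p, a set S of atoms not containing p, a formula F, and Y the atoms outside S,

    \mathit{SNC} \;=\; \exists Y\, (F \land p)         % strongest necessary condition of p on S under F
    \mathit{WSC} \;=\; \forall Y\, (F \rightarrow p)   % weakest sufficient condition of p on S under F
    % a definiens of p in terms of S within F exists iff  \mathit{SNC} \models \mathit{WSC},
    % and in that case \mathit{SNC} (as well as \mathit{WSC}) is itself such a definiens.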

Semantik, Linked Data, Web-Präsentation: Grundlagen der Nachlasserschließung im Portal www.pueckler-digital.de
Jana Kittelmann and Christoph Wernhard
In Anne Baillot and Anna Busch, editors, Workshop Datenmodellierung in digitalen Briefeditionen und ihre interpretatorische Leistung. Humboldt-Universität zu Berlin, 2014.
Poster presentation [ Abstract | Poster ].
bib ]

Expressing View-Based Query Processing and Related Approaches with Second-Order Operators
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 14-02, Technische Universität Dresden, 2014.
bib | .pdf ]

Modeling the Suppression Task under Weak Completion and Well-Founded Semantics
Emmanuelle-Anna Dietz, Steffen Hölldobler, and Christoph Wernhard
Journal of Applied Non-Classical Logics, 24(1-2):61-85, 2014.
bib ]

Computing with Logic as Operator Elimination: The ToyElim System
Christoph Wernhard
In Hans Tompits, Salvador Abreu, Johannes Oetsch, Jörg Pührer, Dietmar Seipel, Masanobu Umeda, and Armin Wolf, editors, Applications of Declarative Programming and Knowledge Management, 19th International Conference (INAP 2011) and 25th Workshop on Logic Programming (WLP 2011), Revised Selected Papers, volume 7773 of LNCS (LNAI), pages 289-296. Springer, 2013.
bib ]

A prototype system is described whose core functionality is, based on propositional logic, the elimination of second-order operators, such as Boolean quantifiers and operators for projection, forgetting and circumscription. This approach makes it possible to express many representational and computational tasks in knowledge representation - for example, the computation of abductive explanations and of models with respect to logic programming semantics - in a uniform operational system, backed by a uniform classical semantic framework.
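
A minimal sketch of the underlying propositional identity (not ToyElim's actual input notation): eliminating a Boolean quantifier, that is, forgetting an atom, amounts to a Shannon expansion.

    \exists p\, F \;\equiv\; F[p/\top] \lor F[p/\bot]
    % for example:
    \exists p\, \big((q \rightarrow p) \land (p \rightarrow r)\big) \;\equiv\; \lnot q \lor r \;\equiv\; q \rightarrow r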

Semantik, Web, Metadaten und digitale Edition: Grundlagen und Ziele der Erschließung neuer Quellen des Branitzer Pückler-Archivs
Jana Kittelmann and Christoph Wernhard
In Irene Krebs et al., editors, Resonanzen. Pücklerforschung im Spannungsfeld zwischen Wissenschaft und Kunst. Ein Konferenzbericht., pages 179-202. trafo Verlag, Berlin, 2013.
bib ]

Möglichkeiten der Repräsentation von Nachlassarchiven im Web
Jana Kittelmann and Christoph Wernhard
In Fachtagung Semantische Technologien - Verwertungsstrategien und Konvergenz, Humboldt-Universität zu Berlin, Position Papers, 2013.
bib | http ]

Abduction in Logic Programming as Second-Order Quantifier Elimination
Christoph Wernhard
In Pascal Fontaine, Christophe Ringeissen, and Renate A. Schmidt, editors, 9th International Symposium on Frontiers of Combining Systems, FroCoS 2013, volume 8152 of LNCS (LNAI), pages 103-119. Springer, 2013.
bib ]

It is known that skeptical abductive explanations with respect to classical logic can be characterized semantically in a natural way as formulas with second-order quantifiers. Computing explanations is then just elimination of the second-order quantifiers. By using application patterns and generalizations of second-order quantification, like literal projection, the globally weakest sufficient condition and circumscription, we transfer these principles in a unifying framework to abduction with three non-classical semantics of logic programming: stable model, partial stable model and well-founded semantics. New insights are revealed about abduction with the partial stable model semantics.
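
A hedged propositional illustration of the classical starting point (the paper's contribution is the transfer to the logic programming semantics named above): with background F, observation G, abducible atoms A, and Y the atoms outside A, the weakest formula over A that together with F entails G is the weakest sufficient condition.

    \mathit{WSC} \;=\; \forall Y\, (F \rightarrow G)
    % e.g., for F = p \rightarrow q, G = q, A = \{p\}:
    % \forall q\, \big((p \rightarrow q) \rightarrow q\big) \;\equiv\; p, so p is the weakest explanation of q.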

Soundness of Inprocessing in Clause Sharing SAT Solvers
Norbert Manthey, Tobias Philipp, and Christoph Wernhard
In Matti Järvisalo and Allen Van Gelder, editors, Theory and Applications of Satisfiability Testing, 16th International Conference, SAT 2013, volume 7962 of LNCS, pages 22-39. Springer, 2013.
Received the SAT 2013 Best Paper Award (Manuscript: http://cs.christophwernhard.com/papers/inprocessing.pdf).
bib ]

We present a formalism that models the computation of clause sharing portfolio solvers with inprocessing. The soundness of these solvers is not a straightforward property since shared clauses can make a formula unsatisfiable. Therefore, we develop characterizations of simplification techniques and suggest various settings in which clause sharing and inprocessing can be combined. Our formalization models most of the recently implemented portfolio systems, and we indicate possibilities to improve these. A particular improvement is a novel way to combine clause addition techniques - like blocked clause addition - with clause deletion techniques - like blocked clause elimination or variable elimination.

Towards a Declarative Approach to Model Human Reasoning with Nonmonotonic Logics
Christoph Wernhard
In Thomas Barkowsky, Marco Ragni, and Frieder Stolzenburg, editors, Human Reasoning and Automated Deduction: KI 2012 Workshop Proceedings, volume SFB/TR 8 Report 032-09/2012 of Report Series of the Transregional Collaborative Research Center SFB/TR 8 Spatial Cognition, pages 41-48. Universität Bremen / Universität Freiburg, Germany, 2012.
bib ]

Stenning and van Lambalgen introduced an approach to model empirically studied human reasoning with nonmonotonic logics. Some of the research questions that have been brought up in this context concern the interplay of the open- and closed-world assumption, the suitability of particular logic programming semantics for the modeling of human reasoning, and the role of three-valued logic programming semantics and three-valued logics. We look into these questions from the view of a framework where logic programs that model human reasoning are represented declaratively and are mechanizable by classical formulas extended with certain second-order operators.

Projection and Scope-Determined Circumscription
Christoph Wernhard
Journal of Symbolic Computation, 47:1089-1108, 2012.
bib | DOI ]

We develop a semantic framework that extends first-order logic by literal projection and a novel second semantically defined operator, “raising”, which is only slightly different from literal projection and can be used to define a generalization of parallel circumscription with varied predicates in a straightforward and compact way. We call this variant of circumscription “scope-determined”, since, like literal projection and raising, its effects are controlled by a so-called “scope”, that is, a set of literals, as a parameter. We work out formally a toolkit of propositions about projection, raising and circumscription and their interaction. It reveals some refinements of and new views on previously known properties. In particular, we apply it to show that well-foundedness with respect to circumscription can be expressed in terms of projection, and that a characterization of the consequences of circumscribed propositional formulas in terms of literal projection can be generalized to first-order logic and expressed compactly in terms of new variants of the strongest necessary and weakest sufficient condition.
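
For orientation (the textbook second-order rendering that the scope-determined variant generalizes; the scope-based formulation itself is developed in the paper): parallel circumscription of predicates P with varied predicates Z in F(P, Z) can be written as

    \mathrm{CIRC}[F; P; Z] \;=\; F(P, Z) \land \lnot \exists P'\, \exists Z'\, \big(F(P', Z') \land P' < P\big)
    % where P' < P states that P' is pointwise included in P and differs from it somewhere.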

Forward Human Reasoning Modeled by Logic Programming Modeled by Classical Logic with Circumscription and Projection
Christoph Wernhard
Technical Report Knowledge Representation and Reasoning 11-07, Technische Universität Dresden, 2011.
bib | .pdf ]

Recently an approach to model human reasoning as studied in cognitive science by logic programming has been introduced by Stenning and van Lambalgen and exemplified with the suppression task. We investigate this approach from the view of a framework where different logic programming semantics correspond to different translations of logic programs into formulas of classical two-valued logic extended by two second-order operators, circumscription and literal projection. Based on combining and extending previously known such renderings of logic programming semantics, we take semantics into account that have not yet been considered in the context of human reasoning, such as stable models and partial stable models. To model human reasoning, it is essential that only some predicates can be subjected to closed world reasoning, while others are handled by open world reasoning. In our framework, variants of familiar logic programming semantics that are extended with this feature are derived from a generic circumscription based representation. Further, we develop a two-valued representation of a three-valued logic that renders semantics considered for human reasoning based on the Fitting operator.

Computing with Logic as Operator Elimination: The ToyElim System
Christoph Wernhard
In Proceedings of the 25th Workshop on Logic Programming, WLP 2011, Infsys Research Report 1843-11-06, pages 94-98. Technische Universität Wien, 2011.
bib | http ]

A prototype system is described whose core functionality is, based on propositional logic, the elimination of second-order operators, such as Boolean quantifiers and operators for projection, forgetting and circumscription. This approach makes it possible to express many representational and computational tasks in knowledge representation - for example, the computation of abductive explanations and of models with respect to logic programming semantics - in a uniform operational system, backed by a uniform classical semantic framework.

An Abductive Model for Human Reasoning
Steffen Hölldobler, Tobias Philipp, and Christoph Wernhard
In Logical Formalizations of Commonsense Reasoning, Papers from the AAAI 2011 Spring Symposium, AAAI Spring Symposium Series Technical Reports, pages 135-138. AAAI Press, 2011.
bib ]

In this paper we contribute to bridging the gap between human reasoning as studied in Cognitive Science and commonsense reasoning based on formal logics and formal theories. In particular, the suppression task studied in Cognitive Science provides an interesting challenge problem for human reasoning based on logic. The work presented in the paper is founded on the recent approach by Stenning and van Lambalgen to model human reasoning by means of logic programs with a specific three-valued completion semantics and a semantic fixpoint operator that yields a least model, as well as abduction. Their approach has been subsequently made more precise and technically accurate by switching to three-valued Łukasiewicz logic. In this paper, we extend this refined approach by abduction. We show that the inclusion of abduction makes it possible to adequately model additional empirical results reported from Cognitive Science. For the arising abductive reasoning tasks we give complexity results. Finally, we outline several open research issues that emerge from the application of logic to model human reasoning.

Circumscription and Projection as Primitives of Logic Programming
Christoph Wernhard
In M. Hermenegildo and T. Schaub, editors, Technical Communications of the 26th International Conference on Logic Programming, ICLP'10, volume 7 of Leibniz International Proceedings in Informatics (LIPIcs), pages 202-211, Dagstuhl, Germany, 2010. Schloss Dagstuhl-Leibniz-Zentrum für Informatik.
bib | DOI | http ]

We pursue a representation of logic programs as classical first-order sentences. Different semantics for logic programs can then be expressed by the way in which they are wrapped into - semantically defined - operators for circumscription and projection. (Projection is a generalization of second-order quantification.) We demonstrate this for the stable model semantics, Clark's completion and a three-valued semantics based on the Fitting operator. To represent the latter, we utilize the polarity sensitivity of projection, in contrast to second-order quantification, and a variant of circumscription that makes it possible to express predicate minimization in parallel with maximization. In accord with the aim of an integrated view on different logic-based representation techniques, the material is worked out on the basis of first-order logic with a Herbrand semantics.
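
A minimal propositional illustration of two of the semantics mentioned (a toy example for orientation, not the paper's actual circumscription- and projection-based rendering): for the program consisting of the single rule p ← not q,

    % Clark's completion (q has no rule, so it is completed to falsity):
    (p \leftrightarrow \lnot q) \land (q \leftrightarrow \bot) \;\equiv\; p \land \lnot q
    % its unique classical model \{p\} is also the unique stable model of the program.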

Literal Projection and Circumscription
Christoph Wernhard
In Nicolas Peltier and Viorica Sofronie-Stokkermans, editors, Proceedings of the 7th International Workshop on First-Order Theorem Proving, FTP'09, volume 556 of CEUR Workshop Proceedings, pages 60-74. CEUR-WS.org, 2010.
bib | http ]

We develop a formal framework intended as a preliminary step for a single knowledge representation system that provides different representation techniques in a unified way. In particular we consider first-order logic extended by techniques for second-order quantifier elimination and non-monotonic reasoning. In this paper two independent results are developed. The background for the first result is literal projection, a generalization of second-order quantification which permits, so to speak, quantifying upon arbitrary sets of ground literals, instead of just (all ground literals with) a given predicate symbol. We introduce an operator raise that is only slightly different from literal projection and can be used to define a generalization of predicate circumscription in a straightforward and compact way. We call this variant of circumscription scope-determined. Some properties of raise and scope-determined circumscription, also in combination with literal projection, are then shown. A previously known characterization of consequences of circumscribed formulas in terms of literal projection is generalized from propositional to first-order logic and proven on the basis of the introduced concepts. The second result developed in this paper is a characterization of stable models in terms of circumscription. Unlike traditional characterizations, it does not rely on syntactic notions like reduct and fixed-point construction. It essentially renders a recently proposed “circumscription-like” characterization in a compact way, without involvement of a non-classically interpreted connective.

Tableaux for Projection Computation and Knowledge Compilation
Christoph Wernhard
In Martin Giese and Arild Waaler, editors, Automated Reasoning with Analytic Tableaux and Related Methods: 18th International Conference, TABLEAUX 2009, volume 5607 of LNCS (LNAI), pages 325-340. Springer, 2009.
bib | DOI | .pdf ]

Projection computation is a generalization of second-order quantifier elimination, which in turn is closely related to the computation of forgetting and of uniform interpolants. On the basis of a unified view on projection computation and knowledge compilation, we develop a framework for applying tableau methods to these tasks. It takes refinements from performance oriented systems into account. Formula simplifications are incorporated at the level of tableau structure modification, and at the level of simplifying encountered subformulas that are not yet fully compiled. In particular, such simplifications can involve projection computation, where this is possible with low cost. We represent tableau construction by means of rewrite rules on formulas, extended with some auxiliary functors, which is particularly convenient for formula transformation tasks. As instantiations of the framework, we discuss approaches to propositional knowledge compilation from the literature, including adaptions of DPLL, and the hyper tableau calculus for first-order clauses.

Automated Deduction for Projection Elimination
Christoph Wernhard
Number 324 in Dissertations in Artificial Intelligence. AKA Verlag/IOS Press, Heidelberg, Amsterdam, 2009.
bib | http ]

Projection is a logic operation which makes it possible to express tasks in knowledge representation. These tasks involve extraction or removal of knowledge concerning a given sub-vocabulary. It is a generalization of second-order quantification, permitting, so to speak, “quantifying” upon an arbitrary set of ground literals instead of just (all ground literals with) a given predicate symbol. In Automated Deduction for Projection Elimination, a semantic characterization of projection for first-order logic is presented. On this basis, properties underlying applications and processing methods are derived. The computational processing of projection, called projection elimination in analogy to quantifier elimination, can be performed by adapted theorem proving methods. This is shown for resolvent generation and, in more depth, tableau construction. An abstract framework relates projection elimination with knowledge compilation and shows the adaption of key features of high performance tableau systems. As a prototypical instance, an adaption of a modern DPLL method of the kind underlying state-of-the-art SAT solvers is worked out. It generalizes various recent knowledge compilation methods and utilizes the interplay with projection elimination for efficiency improvements.

HydraSAT 2009.3 Solver Description
Christoph Baldow, Friedrich Gräter, Steffen Hölldobler, Norbert Manthey, Max Seelemann, Peter Steinke, Christoph Wernhard, Konrad Winkler, and Erik Zenker
In Daniel Le Berre et al., editor, SAT 2009 Competitive Event Booklet, pages 15-16, 2009.
bib | .pdf ]

Literal Projection for First-Order Logic
Christoph Wernhard
In Steffen Hölldobler, Carsten Lutz, and Heinrich Wansing, editors, Logics in Artificial Intelligence: 11th European Conference, JELIA 08, volume 5293 of LNCS (LNAI), pages 389-402. Springer, 2008.
bib | DOI ]

The computation of literal projection generalizes predicate quantifier elimination by permitting, so to speak, quantifying upon arbitrary sets of ground literals, instead of just (all ground literals with) a given predicate symbol. Literal projection makes it possible, for example, to express predicate quantification upon a predicate in just positive or negative polarity. Occurrences of the predicate in literals with the complementary polarity are then considered as unquantified predicate symbols. We present a formalization of literal projection and related concepts, such as literal forgetting, for first-order logic with a Herbrand semantics, which makes these notions easy to access, since they are expressed there by means of straightforward relationships between sets of literals. With this formalization, we show properties of literal projection which hold for formulas that are free of certain links, pairs of literals with complementary instances, each in a different conjunct of a conjunction, both in the scope of a universal first-order quantifier, or one in a subformula and the other in its context formula. These properties can justify the application of methods that construct formulas without such links to the computation of literal projection. Some tableau methods and direct methods for second-order quantifier elimination can be understood in this way.

System Description: E-KRHyper
Björn Pelzer and Christoph Wernhard
In Frank Pfenning, editor, Automated Deduction: CADE-21, volume 4603 of LNCS (LNAI), pages 503-513. Springer, 2007.
bib | DOI | .pdf ]

The E-KRHyper system is a model generator and theorem prover for first-order logic with equality. It implements the new E-hyper tableau calculus, which integrates a superposition-based handling of equality into the hyper tableau calculus. E-KRHyper extends our previous KRHyper system, which has been used in a number of applications in the field of knowledge representation. In contrast to most first-order theorem provers, it supports features important for such applications, for example queries with predicate extensions as answers, handling of large sets of uniformly structured input facts, arithmetic evaluation and stratified negation as failure. It is our goal to extend the range of application possibilities of KRHyper by adding equality reasoning.

Tableaux Between Proving, Projection and Compilation
Christoph Wernhard
Technical Report Arbeitsberichte aus dem Fachbereich Informatik 18/2007, Universität Koblenz-Landau, Institut für Informatik, Universitätsstr. 1, 56070 Koblenz, Germany, 2007.
bib ]

Generalized methods for automated theorem proving can be used to compute formula transformations such as projection elimination and knowledge compilation. We present a framework based on clausal tableaux suited for such tasks. These tableaux are characterized independently of particular construction methods, but important features of empirically successful methods are taken into account, especially dependency directed backjumping and branch local operation. As an instance of that framework an adaption of DPLL is described. We show that knowledge compilation methods can be substantially improved by weaving projection elimination partially into the compilation phase.

Semantic Knowledge Partitioning
Christoph Wernhard
In José Júlio Alferes and João Leite, editors, Logics in Artificial Intelligence: 9th European Conference, JELIA 04, volume 3229 of LNCS (LNAI), pages 552-564. Springer, 2004.
bib | .ps ]

Some operations to decompose a knowledge base (considered as a first-order logic formula) in ways such that only its semantics determines the results are investigated. Intended uses include the extraction of “parts” relevant to an application, the exploration and utilization of implicit possibilities of structuring a knowledge base, and the formulation of query answers in terms of a signature demanded by an application. A semantic framework based on Herbrand interpretations is outlined. The notion of “model relative to a scope” is introduced. It underlies the partitioning operations “projection” and “forgetting” and also provides a semantic account for certain formula simplification operations. An algorithmic approach which is based on resolution and may be regarded as a variation of the SCAN algorithm is discussed.

Semantic Knowledge Partitioning (Extended Abstract)
Christoph Wernhard
In Ulrike Sattler, editor, Contributions to the Doctoral Programme of the Second International Joint Conference on Automated Reasoning, IJCAR 2004, volume 106 of CEUR Workshop Proceedings. CEUR-WS.org, 2004.
bib | http ]

KRHyper Inside - Model Based Deduction in Applications
Peter Baumgartner, Ulrich Furbach, Margret Gross-Hardt, Thomas Kleemann, and Christoph Wernhard
In Proceedings of the CADE-19 workshop Challenges and Novel Applications for Automated Reasoning, pages 55-72, 2003.
bib | .ps ]

Three real-world applications are described, all of which have a full first-order theorem prover based on the hyper tableau calculus as their core component. These applications concern information retrieval in electronic publishing, the integration of description logics with other knowledge representation techniques and XML query processing.

System Description: KRHyper
Christoph Wernhard
Technical Report Fachberichte Informatik 14-2003, Universität Koblenz-Landau, Institut für Informatik, Universitätsstr. 1, 56070 Koblenz, Germany, 2003.
Presented at the CADE-19 workshop Model Computation: Principles, Algorithms, Applications.
bib | .ps ]

KRHyper is a first-order logic theorem proving and model generation system based on the hyper tableau calculus. It is targeted for use as an embedded system within knowledge based applications. In contrast to most first-order theorem provers, it supports features important for those applications, for example queries with predicate extensions as answers, handling of large sets of uniformly structured input facts, arithmetic evaluation and stratified negation as failure.

Using Mathematica and Automated Theorem Provers to Access a Mathematical Library
Ingo Dahn, Andreas Haida, Thomas Honigmann, and Christoph Wernhard
In Proceedings of the CADE-15 Workshop on Integration of Deductive Systems, pages 36-43, 1998.
(Revised and extended version: http://cs.christophwernhard.com/papers/integ.ps).
bib | .ps.gz ]

We describe a concept for the cooperation of a computer algebra system, several automated theorem provers and a mathematical library. The purpose of this cooperation is to enable intelligent retrieval of theorems and definitions from the remote mathematical library. Automated theorem provers compete on remote machines to verify conjectures provided by the user in a local copy of Mathematica. They make use of a remote knowledge base which contains parts of the Mizar Mathematical Library.

First Order Proof Problems Extracted from an Article in the Mizar Mathematical Library
Ingo Dahn and Christoph Wernhard
In International Workshop on First-Order Theorem Proving, FTP'97, RISC-Linz Report Series No. 97-50, pages 58-62. Johannes Kepler Universität, Linz, 1997.
bib | .ps ]

Over the years, interactive theorem provers have built a large body of verified computer mathematics. The ILF Mathematical Library aims to make this knowledge available to other systems. One of the reasons for such a project is economy. Verification of software and hardware frequently requires the proof of purely mathematical theorems. It is obviously inefficient to use the time of experts in the design of software or hardware systems to prove such theorems. Another reason for presenting a collection of mathematical theorems in a unified framework is safety. It should facilitate the verification of theorems in the library of one system by other systems. A third reason is dynamics of research. New interactive theorem provers should get the opportunity to show their usability for real-world problems without having to reprove elementary mathematical facts. Last but not least, it is hoped that reproving theorems in a uniform mathematical library will be considered as a challenge to the development of automated theorem provers.

DB-CLOS: Eine Datenbankschnittstelle für das Common Lisp Object System
Heinz Schweppe, Christoph Wernhard, and Jutta Estenfeld
In W. Remmele, editor, Künstliche Intelligenz in der Praxis. Siemens AG, 1990.
bib ]

Unpublished Material

InfraEngine: Inferencing in the Semantic Web by Planning
Christoph Wernhard
System description (edited 2007). (A prototype implementation can be downloaded from http://www.infraengine.com/), 2002.
bib | .pdf ]

The idea of InfraEngine is to help establish a Semantic Web infrastructure by means of a specially adapted AI planning engine. Inputs are distributed Web documents in Semantic Web formats proposed by the World Wide Web Consortium. Outputs are also delivered in such formats. The user interface makes it possible to browse Semantic Web data and to control the planning services from any Web browser. Other programs can also make use of these services. The aim is to provide a small and understandable, but general, mechanism that can be used for different kinds of applications.

Representing Proofs in the Semantic Web
Christoph Wernhard
Working paper for Persist AG, Teltow, Germany, 2001.
bib | .pdf ]

First steps towards an RDF format for exchanging proofs in the Semantic Web.

Two Short Term Applications of the Semantic Web
Christoph Wernhard
Working paper for Persist AG, Teltow, Germany, 2001.
bib | .pdf ]

We outline two application scenarios which might be in the realm of short term applications of the Semantic Web: a software packaging system and the organization of a business trip. Both of them can be solved with today's technology to some degree, so they do not show the novel potential of the Semantic Web in full. However, considering Semantic Web solutions for them is useful to get a picture of the characteristics of the Semantic Web and to become aware of some concrete technical issues.

The Planning Web in Action
Christoph Wernhard
Working paper for Persist AG, Teltow, Germany, 2000.
bib | .pdf ]

We propose resource oriented inference as one way of bringing the Semantic Web into action. It provides a framework for expressing and processing a variety of tasks from areas such as planning, scheduling, manufacturing resource planning, product data management, configuration management, workflow management and simulation. Resource oriented inference as a part of the Semantic Web should allow such tasks to be performed within the scope of the World Wide Web. A prototypical application is the purchase of a complex product with the help of the Web. The product consists of multiple parts, some of them complex products by themselves. Several services are required to compose the product. Subparts and services can be provided by different companies all over the world. Delivery time and cost of the product should be optimized.

Towards a Semantic Web Modeling Language
Christoph Wernhard
Working paper for Persist AG, Teltow, Germany. Presented at KnowTech 2000, Leipzig, 2000.
bib | .pdf ]

Outline of a small Web embedded language for specifying types and reasoning about them. Its main features are: (1.) From primitive types, which require of their instances just that they implement a single method, complex types can be constructed by a few operators, such as type intersection. This composition of type definitions maps to the composition of fragments of type definitions in different Web documents. (2.) Types are compared by structural equivalence and not based on their names. This facilitates combination of independently developed models and implicit association of types with given instances. (3.) The hyperlinking of expressions of the modeling language can be understood in terms of defining equations of a rewriting system. The left side of such a rewrite rule is a URI, the right side an expression. Hyperlink dereferencing corresponds to a rewrite step with such a rule.

Experiments with a Linear Backward Chaining Planner
Christoph Wernhard
(Edited May 2003), 1999.
bib | .ps ]

The performance of an implementation of the linear backward chaining planning algorithm is compared with that of other planning systems by means of the problem set of the first AI planning systems competition (AIPS-98).

Entwurf und Implementierung einer Datenbankschnittstelle für das Common Lisp Object System
Christoph Wernhard
Master's thesis (Magisterarbeit), Freie Universität Berlin, Berlin, Germany, 1992.
(The system documentation is available from http://cs.christophwernhard.com/cdb/).
bib ]
