Chair for Automata Theory of the Institute for Theoretical Computer Science, Faculty of Computer Science at TU Dresden

Habilitation and PhD Theses supervised by Prof. Franz Baader

Habilitation Theses

PhD Theses


There are also lists of our papers and technical reports.

Habilitation Theses


R. Peñaloza

Reasoning with Annotated Description Logic Ontologies

Download (0.41MB, PDF)


A.-Y. Turhan

Reasoning Services for the Maintenance and Flexible Access to Description Logic Ontologies

Download (5.9MB, PDF)


C. Lutz




Download (?.??MB, PDF)


U. Sattler

Description Logics for Ontologies


Description Logics (DLs) are a family of knowledge representation formalisms designed for the representation of terminological knowledge. A DL knowledge base consists (at least) of a set of concept definitions, namely definitions of those concepts that are relevant for the specific application. Standard inference services provided by DL-based knowledge representation systems include testing whether each defined concept is satisfiable and computing the subsumption hierarchy of the defined concepts, i.e., the specialisation relation between the defined concepts.
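
As a small illustrative example (not taken from the thesis), two such concept definitions and a subsumption that a DL reasoner would infer from them might look as follows:

```latex
% Illustrative ALC-style concept definitions and an inferred subsumption.
\begin{align*}
  \mathsf{Mother}       &\equiv \mathsf{Woman} \sqcap \exists \mathit{hasChild}.\mathsf{Person}\\
  \mathsf{MotherOfSons} &\equiv \mathsf{Woman} \sqcap \exists \mathit{hasChild}.\mathsf{Person}
                                 \sqcap \forall \mathit{hasChild}.\mathsf{Male}\\
  \text{inferred:}\quad &\mathsf{MotherOfSons} \sqsubseteq \mathsf{Mother}
\end{align*}
```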

Together with the well-defined semantics of DLs, these inference services make DLs suitable candidates for ontology languages, which have become increasingly important due to the amount of information available electronically and the vision of the semantic web. For a variety of DLs, decision procedures, tight complexity bounds, and practical inference algorithms for the corresponding inference problems are known. It is clear that, to be of use as an ontology language, a description logic has to provide adequate expressive power, and we are thus concerned with the well-known trade-off between complexity and expressiveness.

After a brief introduction to ontologies, we introduce the basic description logic ALC and describe how DLs can be used as ontology languages. Next, we sketch the relationship between DLs and other formalisms such as first-order and modal logic and database conceptual models. To give a broader view of DLs, some standard expressive means of DLs are mentioned, as well as their modal logic counterparts and their effect on the complexity of the inference problems. In Section 3, we give an intuitive explanation of the standard reasoning techniques employed for DLs and discuss their respective advantages: tableau-based algorithms have turned out to be well-suited for implementations, whereas automata-based algorithms yield elegant upper complexity bounds for ExpTime logics. In many cases, a DL was first proven to be in ExpTime using automata before a tableau-based algorithm was designed and implemented.
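
To make the tableau idea concrete, the following is a minimal sketch (illustrative only, not the algorithms developed in the thesis; the tuple encoding and function names are ad hoc) of a tableau-style satisfiability check for ALC concepts without a TBox: conjunctions are expanded, disjunctions cause branching, and each existential restriction spawns a successor node that also receives the fillers of the matching universal restrictions.

```python
# Concepts are tuples: ('atom', A), ('not', C), ('and', C, D), ('or', C, D),
# ('some', r, C), ('all', r, C).  No TBox, so the recursion terminates.

def nnf(c):
    """Push negation inwards so that 'not' occurs only in front of atoms."""
    if c[0] == 'not':
        d = c[1]
        if d[0] == 'atom':
            return c
        if d[0] == 'not':
            return nnf(d[1])
        if d[0] == 'and':
            return ('or', nnf(('not', d[1])), nnf(('not', d[2])))
        if d[0] == 'or':
            return ('and', nnf(('not', d[1])), nnf(('not', d[2])))
        if d[0] == 'some':
            return ('all', d[1], nnf(('not', d[2])))
        if d[0] == 'all':
            return ('some', d[1], nnf(('not', d[2])))
    if c[0] in ('and', 'or'):
        return (c[0], nnf(c[1]), nnf(c[2]))
    if c[0] in ('some', 'all'):
        return (c[0], c[1], nnf(c[2]))
    return c  # atom

def satisfiable(label):
    """Complete the label of one tableau node; recurse on successor nodes."""
    label = set(label)
    changed = True
    while changed:                          # expand conjunctions exhaustively
        changed = False
        for c in list(label):
            if c[0] == 'and' and not {c[1], c[2]} <= label:
                label |= {c[1], c[2]}
                changed = True
    for c in label:                         # clash check on atomic concepts
        if c[0] == 'not' and c[1] in label:
            return False
    for c in label:                         # branch on unexpanded disjunctions
        if c[0] == 'or' and c[1] not in label and c[2] not in label:
            return satisfiable(label | {c[1]}) or satisfiable(label | {c[2]})
    for c in label:                         # one successor per existential
        if c[0] == 'some':
            succ = {c[2]} | {d[2] for d in label if d[0] == 'all' and d[1] == c[1]}
            if not satisfiable(succ):
                return False
    return True

# Example: (some r. A) and (all r. not A) is unsatisfiable.
A = ('atom', 'A')
c = ('and', ('some', 'r', A), ('all', 'r', ('not', A)))
print(satisfiable({nnf(c)}))  # False
```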

Having thus introduced description logics and how they can be used as ontology languages, in Section 4, we describe how we have designed the rather successful DLs SHIQ and RIQ: we first observe that ALC lacks the expressive power to describe aggregated objects using a transitive part-whole relation, and then extend ALC with a new constructor that overcomes this expressive shortcoming while still allowing for a practical, tableau-based inference algorithm. Step by step, we further extend the resulting DLs with new constructors that were chosen according to the same design goal: to overcome expressive shortcomings while allowing for practical inference algorithms. For each extension, we describe how we have modified the tableau algorithm to take into account the new constructor.

In Section 5, we are concerned with hybrid logics, i.e., description and modal logics that can refer to single individuals using nominals. Nominals are a rather special constructor since they destroy a nice model-theoretic property that most DLs enjoy, namely the tree model property. Despite this effect, we were able to show, for two example hybrid logics, that automata on trees can still be used to decide satisfiability of hybrid DLs and thus provide tight upper complexity bounds. For this purpose, we use a certain abstraction technique from (non-)tree models to tree structures. This technique turns out to be applicable also to tableau algorithms: we have used it to devise a tableau algorithm for the extension of (a restriction of) SHIQ with nominals.

Download (0.34MB, PDF)

PhD Theses


M. Lippmann

Temporalised Description Logics for Monitoring Partially Observable Events


It is becoming more and more important to verify that the systems surrounding us have certain properties. This is indeed unavoidable for safety-critical systems such as power plants and intensive-care units. We use the term system in a broad sense: it may be man-made (e.g. a computer system) or natural (e.g. a patient in an intensive-care unit). Whereas in Model Checking it is assumed that one has complete knowledge about the functioning of the system, we consider an open-world scenario and assume that we can only observe the behaviour of the actual running system through sensors. Such an abstract sensor could sense, e.g., the blood pressure of a patient or the air traffic observed by radar.

Then the observed data are preprocessed appropriately and stored in a fact base. Based on the data available in the fact base, situation-awareness tools are supposed to help the user to detect certain situations that require intervention by an expert. Such situations could be that the heart-rate of a patient is rather high while the blood pressure is low, or that a collision of two aeroplanes is about to happen.

Moreover, the information in the fact base can be used by monitors to verify that the system has certain properties. It is not realistic, however, to assume that the sensors always yield a complete description of the current state of the observed system. Thus, it makes sense to assume that information that is not present in the fact base is unknown rather than false. Moreover, very often one has some knowledge about the functioning of the system. This background knowledge can be used to draw conclusions about the possible future behaviour of the system. Employing description logics (DLs) is one way to deal with these requirements. In this thesis, we tackle the sketched problem in three different contexts: (i) runtime verification using a temporalised DL, (ii) temporalised query entailment, and (iii) verification in DL-based action formalisms.

Electronic publication available from the Electronic Publication Archive of the Saxon State and University Library Dresden.

S. Borgwardt

Fuzzy Description Logics with General Concept Inclusions


Description logics (DLs) are used to represent knowledge of an application domain and provide standard reasoning services to infer consequences of this knowledge. However, classical DLs are not suited to representing vagueness in the description of the knowledge. We consider a combination of DLs and Fuzzy Logics to address this task. In particular, we consider the t-norm-based semantics for fuzzy DLs introduced by Hájek in 2005. Since then, many tableau algorithms have been developed for reasoning in fuzzy DLs. Another popular approach is to reduce fuzzy ontologies to classical ones and use existing highly optimized classical reasoners to deal with them. However, a systematic study of the computational complexity of the different reasoning problems is so far missing from the literature on fuzzy DLs. Recently, some of the developed tableau algorithms have been shown to be incorrect in the presence of general concept inclusion axioms (GCIs). In some fuzzy DLs, reasoning with GCIs has even turned out to be undecidable. This work provides a rigorous analysis of the boundary between decidable and undecidable reasoning problems in t-norm-based fuzzy DLs, in particular for GCIs. Existing undecidability proofs are extended to cover large classes of fuzzy DLs, and decidability is shown for most of the remaining logics considered here. Additionally, the computational complexity of reasoning in fuzzy DLs with semantics based on finite lattices is analyzed. For most decidability results, tight complexity bounds can be derived.
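
For orientation, the three fundamental continuous t-norms and the usual semantics of a fuzzy GCI are shown below (a standard textbook-style illustration, not a verbatim excerpt from the thesis):

```latex
\begin{gather*}
  x \otimes_{\mathrm{G}} y = \min(x,y), \qquad
  x \otimes_{\Pi} y = x \cdot y, \qquad
  x \otimes_{\mathrm{L}} y = \max(0,\; x + y - 1),\\
  \mathcal{I} \models \langle C \sqsubseteq D \ge q \rangle
  \;\text{ iff }\;
  \inf_{d \in \Delta^{\mathcal{I}}}
    \bigl( C^{\mathcal{I}}(d) \Rightarrow D^{\mathcal{I}}(d) \bigr) \ge q,
\end{gather*}
```

where the implication function is the residuum of the chosen t-norm.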

Electronic publication available from the Electronic Publication Archive of the Saxon State and University Library Dresden.


F. Distel

Learning Description Logic Knowledge Bases from Data Using Methods from Formal Concept Analysis


Description Logics (DLs) are a class of knowledge representation formalisms that can represent terminological and assertional knowledge using a well-defined semantics. Often, knowledge engineers are experts in their own fields, but not in logics, and require assistance in the process of ontology design. This thesis presents three methods that can extract terminological knowledge from existing data and thereby assist in the design process. They are based on related formalisms from Formal Concept Analysis (FCA), in particular the Next-Closure algorithm and attribute exploration. The first of the three methods computes terminological knowledge from the data without any expert interaction. The other two methods use expert interaction, where a human expert can confirm each terminological axiom or refute it by providing a counterexample. These two methods differ only in the way counterexamples are provided.
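
As an illustration of the FCA machinery involved, the following sketch (illustrative Python with a toy closure operator; not code from the thesis) shows Ganter's Next-Closure algorithm, which enumerates all closed sets of a closure operator in lectic order and is the backbone of attribute exploration.

```python
def next_closure(A, attributes, closure):
    """Return the lectically next closed set after A, or None if A is the last one.
    `attributes` is a list fixing the linear order; `closure` maps frozensets to frozensets."""
    for i in reversed(range(len(attributes))):
        m = attributes[i]
        if m in A:
            A = A - {m}
        else:
            B = closure(A | {m})
            # accept B if it adds no attribute smaller than m
            if all(attributes.index(n) >= i for n in B - A):
                return B
    return None

def all_closed_sets(attributes, closure):
    """Enumerate all closed sets, starting from the closure of the empty set."""
    A = closure(frozenset())
    while A is not None:
        yield A
        A = next_closure(A, attributes, closure)

# Toy example: the closure operator induced by the single implication {a} -> {b}.
def cl(X):
    X = set(X)
    if 'a' in X:
        X.add('b')
    return frozenset(X)

for C in all_closed_sets(['a', 'b', 'c'], cl):
    print(sorted(C))   # {}, {c}, {b}, {b,c}, {a,b}, {a,b,c} in lectic order
```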

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

Download (1.3MB, PDF)


M. Knechtel

Access Restrictions to and with Description Logic Web Ontologies


Access restrictions are essential in standard information systems and have become an issue for ontologies in two respects. First, ontologies can represent explicit and implicit knowledge about an access policy; for this aspect we provide a methodology to represent and systematically complete role-based access control policies. Orthogonally, an ontology might be available for limited reading access only. Independently of a specific ontology language or reasoner, we provide a lattice-based framework for assigning labels to an ontology's axioms and consequences. We study the problems of computing and repairing one or multiple consequence labels and of assigning query-based access restrictions. An empirical evaluation has shown that the algorithms perform well in practical scenarios with large-scale ontologies.

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

Download (3.79MB, PDF), slides

H. Liu

Computing Updates in Description Logics


Description Logics (DLs) form a family of knowledge representation formalisms which can be used to represent and reason with conceptual knowledge about a domain of interest. The knowledge represented in DLs is mainly static, but in many applications the domain knowledge is dynamic. This observation motivates research on how to update the knowledge when changes in the application domain take place. This thesis is dedicated to the study of updating knowledge, more precisely, assertional knowledge represented in DLs. We explore whether the updated knowledge can be expressed in several standard DLs and, if so, whether it is computable and how large it is.

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

Download (1.48MB, PDF)


R. Peñaloza

Axiom-Pinpointing in Description Logics and Beyond


Building and maintaining large-scale ontologies is an error-prone task. It is thus not uncommon to find unwanted or unexpected consequences that follow implicitly from the restrictions in the ontology. To understand and correct these consequences, it is helpful to find the specific portions of the ontology that are responsible for them. Axiom-pinpointing is the task of finding minimal subontologies that entail a given consequence, also called MinAs. In this work we look at the task of computing all MinAs by means of modified decision procedures. We first show that tableaux- and automata-based decision procedures can be transformed into pinpointing algorithms that output a (compact) representation of the set of all MinAs. We then explore the complexity of the problem.
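
To make the notion of a MinA concrete, the following naive black-box sketch (illustrative Python; the thesis develops glass-box procedures and the computation of all MinAs) extracts one MinA by linear pruning. The function `entails` stands in for an arbitrary reasoner call, and the Horn-clause toy example is purely hypothetical.

```python
def one_mina(axioms, entails):
    """axioms: list of axioms such that entails(axioms) is True.
    Returns a minimal sublist that still entails the consequence."""
    mina = list(axioms)
    for ax in list(axioms):
        candidate = [a for a in mina if a != ax]
        if entails(candidate):      # axiom not needed for the consequence: drop it
            mina = candidate
    return mina

# Toy example: does 'q' follow from a set of Horn clauses (body, head)?
def entails_q(axs):
    facts = set()
    changed = True
    while changed:
        changed = False
        for body, head in axs:
            if set(body) <= facts and head not in facts:
                facts.add(head)
                changed = True
    return 'q' in facts

axioms = [((), 'p'), (('p',), 'q'), ((), 'unused'), (('r',), 'q')]
print(one_mina(axioms, entails_q))   # [((), 'p'), (('p',), 'q')]
```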

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

Download (0.97MB, PDF)

B. Suntisrivaraporn

Polynomial-Time Reasoning Support for Design and Maintenance of Large-Scale Biomedical Ontologies


Description Logics (DLs) belong to a successful family of knowledge representation formalisms with two key assets: formally well-defined semantics, which allows knowledge to be represented in an unambiguous way, and automated reasoning, which allows implicit knowledge to be inferred from the explicitly given one. This thesis investigates various reasoning techniques for tractable DLs in the EL family, which have been implemented in the CEL system. It suggests that the use of lightweight DLs, in which reasoning is tractable, is beneficial for ontology design and maintenance both in terms of expressivity and scalability. The claim is supported by a case study on the renowned medical ontology SNOMED CT and an extensive empirical evaluation on several large-scale biomedical ontologies.
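
For orientation, the completion rules that underlie polynomial-time classification of normalised EL TBoxes are shown below (a standard presentation for illustration only; the thesis and the CEL system handle a richer rule set, including role inclusions). Here S(A) collects the derived subsumers of the concept name A and R(r) the derived r-links between concept names.

```latex
\begin{align*}
  \textbf{CR1:}\;\; & A' \in S(A),\ A' \sqsubseteq B \in \mathcal{T}
      && \Longrightarrow\; B \in S(A)\\
  \textbf{CR2:}\;\; & A_1, A_2 \in S(A),\ A_1 \sqcap A_2 \sqsubseteq B \in \mathcal{T}
      && \Longrightarrow\; B \in S(A)\\
  \textbf{CR3:}\;\; & A' \in S(A),\ A' \sqsubseteq \exists r.B \in \mathcal{T}
      && \Longrightarrow\; (A,B) \in R(r)\\
  \textbf{CR4:}\;\; & (A,B) \in R(r),\ B' \in S(B),\ \exists r.B' \sqsubseteq A' \in \mathcal{T}
      && \Longrightarrow\; A' \in S(A)
\end{align*}
```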

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

Download (1.59MB, PDF)


M. Milicic

Action, Time and Space in Description Logics


Description Logics (DLs) are a family of logic-based knowledge representation (KR) formalisms designed to represent and reason about static conceptual knowledge in a semantically well-understood way. Standard action formalisms, on the other hand, are KR formalisms based on classical logic designed to model and reason about dynamic systems. The largest part of the present work is dedicated to integrating DLs with action formalisms, with the main goal of obtaining decidable action formalisms with an expressiveness significantly beyond the propositional one. To this end, we offer DL-tailored solutions to the frame and ramification problems. The main technical result is that standard reasoning problems about actions (executability and projection), as well as the plan existence problem, are decidable if one restricts the logic for describing action pre- and post-conditions and the state of the world to decidable Description Logics. Moreover, reasoning about DL actions is reducible to standard DL reasoning problems, with the nice consequence that already available DL reasoners can be employed to provide automated support for reasoning about a dynamically changing world. A smaller part of the work concerns the design of a tableau algorithm for decidable extensions of Description Logics with concrete datatypes, most importantly those that allow one to refer to the notions of space and time, and with powerful domain constraints (general concept inclusions).

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

Download (1.2MB, *.pdf)


B. Sertkaya

Formal Concept Analysis Methods for Description Logics


Description Logics (DLs) are a class of logic-based knowledge representation formalisms that are used to represent terminological knowledge of an application domain in a structured way. Formal Concept Analysis (FCA), on the other hand, is a field of applied mathematics that aims to formalize the notions of a concept and a conceptual hierarchy by means of mathematical tools. Although the notion of a concept as a collection of objects sharing certain properties, and the notion of a conceptual hierarchy, are fundamental to both DLs and FCA, the ways concepts are described and obtained differ significantly between these two research areas.

In the present work we show that, despite these differences, DL research can benefit from FCA research by employing FCA methods to solve problems encountered in knowledge representation. Our contributions in this direction fall mainly under two topics: 1) supporting bottom-up construction of DL knowledge bases, and 2) completing DL knowledge bases. For the first topic, we show how the attribute exploration method can be used for computing least common subsumers w.r.t. a background knowledge base. For the second, we develop an extension of FCA for the open-world semantics of DL knowledge bases, and show how the extended attribute exploration method can be used to detect missing axioms and individuals in a DL knowledge base. Moreover, we also contribute to FCA research by investigating the computational complexity of several decision and counting problems related to minimal generators of closed sets.

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

Download (1.13 MB, pdf)

J. Hladik

To and Fro Between Tableaus and Automata for Description Logics


Description Logics (DLs) are a family of knowledge representation languages with well-defined logic-based semantics and decidable inference problems, e.g. satisfiability. Two of the most widely used kinds of decision procedures for the satisfiability problem are tableau- and automata-based algorithms. Due to their different modes of operation, these two classes have complementary properties: tableau algorithms are well-suited for implementation and for showing PSPACE and NEXPTIME complexity results, whereas automata algorithms are particularly useful for showing EXPTIME results and allow for an elegant handling of infinite structures, but are not suited for implementation. The aim of this thesis is to analyse the reasons for these differences and to find ways of transferring properties between the two approaches in order to reconcile the positive properties of both.

In a first approach, we describe a method to translate automata into DL knowledge bases, which allows us to use tableau reasoners to decide the automata emptiness problem. Since empirical results show that this does not lead to acceptable performance in practice, we develop a way of transferring PSPACE complexity results from tableaus to automata by modifying the automata emptiness test with optimisation techniques known from tableaus, in particular by using a blocking condition. Finally, in order to transfer EXPTIME complexity results from the automata to the tableau paradigm, we develop Tableau Systems, a framework for tableau algorithms. From an algorithm formalised within this framework, it is possible to derive both an EXPTIME automata algorithm and a tableau algorithm that is usable in practice.

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

A.-Y. Turhan

On the Computation of Common Subsumers in Description Logics


Description logic (DL) knowledge bases are often built by users with expertise in the application domain, but little expertise in logic. To support such users when building their knowledge bases, a number of extension methods have been proposed that provide the user with concept descriptions as a starting point for new concept definitions. The inference service central to several of these approaches is the computation of (least) common subsumers of concept descriptions.

If disjunction of concepts can be expressed in the DL under consideration, the least common subsumer (lcs) is just the disjunction of the input concepts. Such a trivial lcs is of little use as a starting point for a new concept definition to be edited by the user. To address this problem, we propose two approaches to obtain "meaningful" common subsumers in the presence of disjunction, tailored to two different methods for extending DL knowledge bases. More precisely, we devise computation methods for the approximation-based approach and for the customization of DL knowledge bases, extend these methods to DLs with number restrictions, and discuss their efficient implementation.
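
A toy illustration (not an example from the thesis, and assuming Pump and Valve share no named subsumer) of why the exact lcs becomes uninformative once disjunction is available:

```latex
\begin{align*}
  C &= \mathsf{Device} \sqcap \exists \mathit{hasPart}.\mathsf{Pump}, &
  D &= \mathsf{Device} \sqcap \exists \mathit{hasPart}.\mathsf{Valve},\\
  \mathrm{lcs}(C,D) &= C \sqcup D \;\text{(trivial)}, &
  &\;\text{vs. the more informative}\;
     \mathsf{Device} \sqcap \exists \mathit{hasPart}.\top
\end{align*}
```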

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

Download (1 MB, *.pdf)


S. Brandt

Standard and Non-standard Reasoning in Description Logics


The present work deals with Description Logics (DLs), a class of knowledge representation formalisms used to represent and reason about classes of individuals and relations between such classes in a formally well-defined way. We provide novel results in three main directions.

(1) Tractable reasoning revisited: in the 1990s, DL research had largely answered in the negative the question of whether there are practically relevant yet tractable DL formalisms. Due to novel application domains, especially the Life Sciences, and a surprising tractability result by Baader, we have revisited this question, this time looking in a new direction: general terminologies (TBoxes), and extensions thereof, defined over the DL EL and extensions thereof. As the main positive result, we devise EL++(D)-CBoxes as a tractable DL formalism with optimal expressivity, in the sense that every additional standard DL constructor, every extension of the TBox formalism, and every more powerful concrete domain makes reasoning intractable.

(2) Non-standard inferences for knowledge maintenance: non-standard inferences, such as matching, can support domain experts in maintaining DL knowledge bases in a structured and well-defined way. In order to extend their availability and promote their use, the present work extends the state of the art of non-standard inferences both w.r.t. theory and implementation. Our main results are implementations and performance evaluations of known matching algorithms for the DLs ALE and ALN, optimal non-deterministic polynomial-time algorithms for matching under acyclic side conditions in ALN and its sublanguages, and optimal algorithms for matching w.r.t. cyclic (and hybrid) EL-TBoxes.

(3) Non-standard inferences over general concept inclusion (GCI) axioms: the utility of GCIs in modern DL knowledge bases and the relevance of non-standard inferences to knowledge maintenance naturally raise the question of whether there is a tractable DL formalism in which both can be provided. As the main result, we propose hybrid EL-TBoxes as a solution to this hitherto open question.

Publisher: Available from the Electronic Publication Archive of the Saxon State and University Library Dresden as an electronic publication of the University of Dresden.

Download (1.5MB, *.pdf)


C. Lutz

The Complexity of Description Logics with Concrete Domains


Concrete domains are an extension of Description Logics (DLs) that allows the integration of reasoning about conceptual knowledge with reasoning about "concrete qualities" of real-world entities such as their age, weight, shape, and temporal extension. In this thesis, we perform an in-depth analysis of the computational complexity of reasoning with DLs that are equipped with concrete domains.
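
A standard textbook-style example of a concrete-domain concept (for illustration only, not taken from the thesis): adults are humans whose age, accessed via a concrete feature, is at least 18.

```latex
\[
  \mathsf{Adult} \equiv \mathsf{Human} \sqcap \exists \mathit{age}.{\ge_{18}}
\]
```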

The main results are that (i) reasoning with ALC(D), the basic DL ALC extended with a concrete domain D, is PSpace-complete if reasoning with D is in PSpace; (ii) for many seemingly "harmless" extensions of ALC(D), such as the extension with acyclic TBoxes, role conjunction, and inverse roles, the complexity of reasoning leaps from PSpace-completeness to NExpTime-completeness---at least for a large class of concrete domains; and (iii) although, in general, the combination of concrete domains and general TBoxes leads to undecidability of reasoning, there exists an interesting temporal concrete domain that can be combined with general TBoxes without sacrificing decidability. This concrete domain is used to devise a rather powerful interval-based temporal Description Logic.

As a side-track, we illuminate the connection between Description Logics equipped with concrete domains and DLs providing for so-called feature agreement and disagreement constructors. Indeed, this connection turns out to be quite close: in most cases, the complexity of reasoning with a DL providing for concrete domains is identical to the complexity of reasoning with the corresponding DL that offers feature (dis)agreements.

Publisher: Available from the Hochschulbibliothek der RWTH Aachen as Elektronische Dissertation der RWTH Aachen.

Download (*.ps.gz, 650k; *.pdf, 1500k)


S. Tobies

Complexity Results and Practical Algorithms for Logics in Knowledge Representation

Description Logics (DLs) are used in knowledge-based systems to represent and reason about terminological knowledge of the application domain in a semantically well-defined manner. In this thesis, we establish a number of novel complexity results and give practical algorithms for expressive DLs that provide different forms of counting quantifiers.

We show that, in many cases, adding local counting in the form of qualifying number restrictions to DLs does not increase the complexity of the inference problems, even if binary coding of numbers in the input is assumed. On the other hand, we show that adding different forms of global counting restrictions to a logic may increase the complexity of the inference problems dramatically.
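
For reference, the semantics of qualifying number restrictions, the "local counting" constructor discussed above, is the standard one shown below (for illustration); with binary coding, a number n contributes only about log n symbols to the input size.

```latex
\[
  ({\ge}\, n\; r.C)^{\mathcal{I}}
  = \bigl\{\, d \in \Delta^{\mathcal{I}} \;\big|\;
     \#\{\, e \mid (d,e) \in r^{\mathcal{I}} \text{ and } e \in C^{\mathcal{I}} \,\} \ge n \,\bigr\}
\]
```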

We provide exact complexity results and a practical, tableau-based algorithm for the DL SHIQ, which forms the basis of the highly optimized DL system iFaCT.

Finally, we describe a tableau algorithm for the clique guarded fragment (CGF), which we hope will serve as the basis for an efficient implementation of a CGF reasoner.

Publisher: Will soon be available from the Hochschulbibliothek der RWTH Aachen as Elektronische Dissertation der RWTH Aachen.

Download (*.ps.gz, 760k; *.pdf, 960k)


R. Küsters

Non-Standard Inferences in Description Logics

Description Logics denote a family of knowledge representation formalisms that can be used to represent the terminological knowledge of an application domain in a structured and well-defined way. The standard inferences in these logics, like subsumption and instance checking, have been investigated in detail during the last fifteen years, both from a theoretical and a practical point of view. In some applications it has turned out, however, that the optimal support of building and maintaining large knowledge bases requires additional (non-standard) inferences, such as least common subsumer, most specific concept, and matching of concept descriptions. Some systems contain ad hoc implementations of such inferences, mostly without having defined them formally or having investigated their computational complexity. In this work, a formal basis for these inferences is established by providing precise definitions, complete algorithms, and first complexity results.

Publisher: Appeared in the Springer LNAI series as volume LNAI 2100.

R. Molitor

Unterstützung der Modellierung verfahrenstechnischer Prozesse durch Nicht-Standardinferenzen in Beschreibungslogiken

In process engineering, as in many other areas, a structured representation and storage of the application-specific knowledge is desirable. Within a cooperation between the Lehrstuhl für Prozesstechnik and the Lehr- und Forschungsgebiet Theoretische Informatik, it had already been shown that Description Logics are very well suited for this task owing to their high expressive power and powerful inference algorithms. For instance, the inference services provided by Description Logic systems as standard make it possible to automatically compute the specialisation hierarchy of a knowledge base. It turned out, however, that these services do not suffice for comprehensive support of building and maintaining the knowledge base.

This thesis therefore investigates the non-standard inferences least common subsumer, most specific concept, and rewriting, which in combination support the definition of new concepts and thus the extension and maintenance of the knowledge base. The results on existence, computability, and complexity, as well as the development of complete algorithms for solving these inference problems, concentrate on Description Logics that are already successfully used in process engineering.

Publisher: Available online from the Hochschulbibliothek der RWTH Aachen as Elektronische Dissertation der RWTH Aachen.

Download (*.ps.gz, 810k; *.pdf, 1460k)


J. Richts

Effiziente Entscheidungsverfahren zur E-Unifikation

In automated theorem proving, E-unification is a technique for complementing generally applicable proof procedures, such as resolution or completion, with specialised procedures. The latter can solve only very specific problems, but do so particularly efficiently. By employing E-unification, the treatment of an equational theory E is shifted from the general proof procedure into a dedicated unification procedure.

If several equational theories E0, ..., En occur in a proof problem, the task arises of combining the individual unification procedures for these theories into a unification procedure for the union of the theories E0, ..., En. Several combination algorithms have been proposed in the literature for this purpose. However, these deal almost exclusively with the combination of E-unification procedures that compute so-called complete sets of unifiers, i.e., sets of solutions. In recent years, interest has also grown in E-unification procedures that work as pure decision procedures. Franz Baader and Klaus Schulz have presented a combination algorithm that can also combine decision procedures for E-unification. This algorithm is, however, mainly of theoretical interest, since a direct implementation is unsuitable for practical problems because of the enormously large search space.

Building on the algorithm of Baader and Schulz, this thesis presents an optimised combination procedure for decision procedures for E-unification. The basic idea of this optimisation is to restrict the search space by exchanging information between the component procedures for the theories E0, ..., En involved. In order to specify this information exchange and prove its correctness, a dedicated formalism, so-called constraints, is introduced. In addition, it is shown for three selected theories how known decision procedures for these theories can be extended so that they can participate in this information exchange. Run-time experiments with the implemented algorithms show that these optimisations indeed lead to unification algorithms that are usable for practical problems.

Publisher: Shaker Verlag, Germany

C.B. Tresp

Beschreibungslogiken zur Behandlung von unscharfem Wissen im Kontext eines medizinischen Anwendungsszenarios

In the medical application domain, the use of traditional knowledge representation systems is made difficult by the fact that statements and inferences are often not exact. Rather, they are frequently available only with a certain degree of vagueness. Nevertheless, there is often a need to formalise the corresponding knowledge appropriately in order to develop application systems that support physicians in their training or in their daily work.

For building a medical tutoring system in the field of radiology, this thesis therefore develops an infinitely-valued (fuzzy) Description Logic that makes it possible to represent precisely the kind of vague knowledge occurring in this setting.

One focus of the thesis is the theoretical foundations of the Description Logic and the reasoning procedure built on top of it, which provides the starting point for a computer system. The properties established here allow statements that are important for the later application, for example concerning the correctness of inferences in this logic.

In addition, the practical questions arising in the design of a medical application system are addressed. An important point here is the processing of medical terminology, which, in the context of the tutoring system, has to be translated fully automatically into constructs of the knowledge representation language.

Publisher: Shaker Verlag, Germany


C.A. Albayrak

Die WHILE-Hierarchie für Programmschemata

This thesis considers several classes of monadic program schemes known from the literature. Monadic program schemes are uninterpreted programs and thus represent programs over potentially arbitrary computation domains with arbitrary unary base functions. In contrast to the notion of equivalence for programs, where two programs are considered equivalent if they compute the same function, the notion of equivalence for program schemes requires that the resulting programs compute the same function under every interpretation of the base functions. For the class of Ianov schemes, a characterisation of equivalent program schemes is known. Here, such a characterisation is given for the class of Dijkstra schemes and the class of Kosaraju schemes. The characterisation assigns regular languages, or vectors of such languages, to schemes; notably, the characterising languages follow the inductive structure of the respective classes. Using these characterisations, the equivalence problem for program schemes of these classes can, for example, be decided very efficiently.

Furthermore, it is shown that the nesting depth of WHILE loops in Dijkstra schemes induces a strict semantic hierarchy of functions, the so-called WHILE hierarchy. This hierarchy result shows that, when arbitrary computation domains and arbitrary base functions are considered, the classical theorem of recursion theory due to S.C. Kleene no longer holds, which states that a single WHILE loop suffices to program arbitrary functions on the natural numbers.

Publisher: Shaker Verlag, Germany

U. Sattler

Terminological Knowledge Representation Systems in a Chemical Engineering Application

This work is concerned with the question of how far terminological knowledge representation systems can support the development of mathematical models of chemical processes. Terminological knowledge representation systems are based on Description Logics, a highly expressive formalism with well-defined semantics, and provide powerful inference services. These system services can be used to support the structuring of the application domain, namely the organised storage of parts of process models. However, the process systems engineering application calls for extensions of the expressive power of already existing Description Logics. These extensions are introduced and investigated with respect to the computational complexity of the corresponding inference problems.

Publisher: Verlag Mainz, Germany

Download (*.ps.gz, 450k)