The color coding in the schedule indicates whether all members participate or only the doctoral students participate.

Date, time, topic, speaker, and room:

2014 Apr 08, 13:30-15:00: Composition Closure of Linear Extended Top-down Tree Transducers. Andreas Maletti (Universität Stuttgart). Universität Leipzig, P 501.
2014 Apr 15, 13:30-16:00: A Nivat Theorem for Weighted Timed Automata and Weighted Relative Distance Logic. Vitaly Perevoshchikov. Followed by an open problems session with Claudia Carapelle, Shiguang Feng, and Markus Teichmann. Universität Leipzig, P 501.
2014 Apr 22, 13:30-15:00: The CFD Family of Description Logics. David Toman (University of Waterloo). TU Dresden, INF E005.
2014 Apr 29, 13:30-15:00: Verification of Timed Pushdown Automata. Karin Quaas (Universität Leipzig). Universität Leipzig, P 501.
2014 May 05-09: WATA 2014, Leipzig.
2014 May 13, 13:30-16:00: Weight Monitoring with Linear Temporal Logic. Sascha Wunderlich. Also: Weighted Synchronous Context-Free Tree Grammars and Pushdown Extended Tree Transducers. Johannes Osterholzer. TU Dresden, INF E005.
2014 May 20, 13:30-15:00: Fragments of Presburger Arithmetic and Constraint Satisfaction. Manuel Bodirsky (École Polytechnique). TU Dresden, INF E005.
2014 Jun 03, 13:30-15:00: The FO(.) Knowledge Base System project: an integration project. Marc Denecker (Univ. Leuven). Universität Leipzig, P 501.
2014 Jun 10: Pentecost (no seminar).
2014 Jun 17, 13:30-16:00: A Compositional Algebra of Specifications. Uli Fahrenberg (IRISA Rennes). Also: On the Non-Monotonic Description Logic ALC+Tmin. Oliver Fernández Gil (Universität Leipzig). Universität Leipzig, P 501.
2014 Jun 24, 13:30-15:00: Non-Ontology-based Data Access: Schema-Agnostic Query Rewriting for OWL DL. Markus Krötzsch (TU Dresden). TU Dresden, INF E005.
2014 Jul 08, 13:30-15:00: Program equilibrium—a program reasoning approach. Wiebe van der Hoek (University of Liverpool). Universität Leipzig, P 501.
2014 Jul 15, 13:00-14:30 (note the earlier starting time of 13:00): Novel Semantical Approaches to Relational Probabilistic Conditionals under Maximum Entropy. Gabriele Kern-Isberner (TU Dortmund). 14:30-16:00: Open Access Publication. Elena Di Rosa (SLUB Dresden). TU Dresden, INF E005.


Dr. rer. nat. Andreas Maletti: Composition Closure of Linear Extended Top-down Tree Transducers

The expressive power of compositions of linear extended top-down tree transducers with and without regular look-ahead is investigated. In particular, the restrictions of epsilon-freeness, strictness, and nondeletion are considered. The composition hierarchy is finite for all epsilon-free variants of these transducers except for epsilon-free nondeleting linear extended top-down tree transducers. The least number of transducers needed for the full expressive power of arbitrary compositions is presented. Along with sketches of all proofs, an elegant new proof technique is presented and illustrated.

Vitaly Perevoshchikov: A Nivat Theorem for Weighted Timed Automata and Weighted Relative Distance Logic

Weighted timed automata (WTA) model quantitative aspects of real-time systems such as the continuous consumption of memory, power or financial resources. They accept quantitative timed languages, in which every timed word is mapped to a value, e.g., a real number. In this talk, we prove a Nivat theorem for WTA, which states that recognizable quantitative timed languages are exactly those which can be obtained from recognizable Boolean timed languages with the help of several simple operations. We also introduce a weighted extension of the relative distance logic developed by Wilke, and we show that our weighted relative distance logic and WTA are equally expressive. The proof of this result can be derived from our Nivat theorem and Wilke's theorem for relative distance logic. Since the proof of our Nivat theorem is constructive, the translation between logic and automata is constructive in both directions. This leads to decidability results for weighted relative distance logic.

Claudia Carapelle, Shiguang Feng, Markus Teichmann: Open problems session

An occasion for QuantLA students to present to their peers a problem on which they are currently working.

David Toman: The CFD Family of Description Logics

We consider the problem of answering conjunctive queries in the description logic CFDnc, a generalization of the logic CFD that is better suited to ontology-based data access (OBDA) than its predecessors. We show that the problem retains PTIME data complexity and exhibit a procedure in which a relational engine can be usefully employed to address scalability issues for an ABox. In the talk we also briefly survey the issues with the earlier logics (exposing their difficulties in dealing with OBDA) and outline a novel reasoning technique that addresses these shortcomings for CFDnc. We also discuss the limits of this approach.

Karin Quaas: Verification of Timed Pushdown Automata

We study the decidability of verification problems for timed automata extended with unbounded discrete data structures. More specifically, we extend timed automata with a pushdown stack. In this way, we obtain a strong model that may, for instance, be used to model real-time programs with procedure calls. It has long been known that the reachability problem for this model is decidable. In this talk we identify subclasses of timed pushdown automata for which the language inclusion problem and related problems are decidable.

Sascha Wunderlich: Weight Monitoring with Linear Temporal Logic

Many important performance and reliability measures can be formalized as the accumulated values of weight functions. In this talk, we introduce an extension of linear temporal logic with new operators that impose constraints on the accumulated weight along path fragments. These fragments are characterized by regular conditions, which can be formalized by deterministic finite automata (monitoring DFA).
This new logic covers properties expressible by several recently proposed formalisms. We show these relations, give examples for properties expressible in this logic, and study the model-checking problem for weighted Markov decision processes with rational weights. While the general problem is undecidable, we provide sharp complexity bounds for several decidable sublogics that arise by restricting the monitoring DFA.
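The accumulation scheme behind such constraints can be sketched in a few lines. The monitoring DFA, the weight assignment, and the bound below are illustrative assumptions, not the formalism of the talk:

```python
# A minimal sketch of accumulated-weight checking (an illustrative model,
# not the talk's formalism): a monitoring DFA marks path fragments, and we
# collect the accumulated weight of every marked fragment.

def fragment_weights(path, weights, dfa, start, accept):
    """Accumulated weight of every path fragment that drives the DFA
    from `start` to an accepting state (missing transitions reject)."""
    results = []
    for i in range(len(path)):
        state, total = start, 0
        for a in path[i:]:
            state = dfa.get((state, a))
            if state is None:
                break
            total += weights[a]
            if state in accept:
                results.append(total)
    return results

# Hypothetical example: fragments of the form "req ... grant" must cost
# at most 5 units of accumulated weight.
dfa = {("q0", "req"): "q1", ("q1", "work"): "q1", ("q1", "grant"): "q2"}
weights = {"req": 1, "work": 2, "grant": 0}
path = ["req", "work", "grant", "req", "grant"]
ws = fragment_weights(path, weights, dfa, "q0", {"q2"})   # [3, 1]
ok = all(w <= 5 for w in ws)                              # constraint holds
```

For Markov decision processes the same accumulation happens over scheduler-resolved paths, which is where the decidability questions of the talk arise.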

Johannes Osterholzer: Weighted Synchronous Context-Free Tree Grammars and Pushdown Extended Tree Transducers

In natural language processing, one often employs a weighted grammar G to account for the structure of a human language. In doing so, a parse tree ξ of a sentence s under G represents a linguistic analysis of s. We may obtain a foreign-language translation of s by applying a weighted tree transformation τ to ξ. This approach is called syntax-based machine translation (SMT). There is good evidence that human language exhibits phenomena that can only be captured by mildly context-sensitive formalisms, whose parse trees form weighted tree languages definable by weighted context-free tree grammars (wcftg). These wcftg allow second-order substitution, i.e., the substitution of a symbol within a tree by a tree with variables. Second-order substitution also promises an improvement in the quality of the tree transformations applied in SMT. Among the proposed transformation formalisms which make use of it are weighted synchronous context-free tree grammars (wscftg).
A wscftg is the ``direct product'' of two wcftg, and allows the synchronous derivation of an input and an output tree of the grammar's tree transformation. We investigate simple wscftg, a syntactic restriction of wscftg. Moreover, we introduce weighted pushdown extended top-down tree transducers (wpxtop). They are extended top-down tree transducers where the finite state control is equipped with a tree pushdown. In a derivation step, a wpxtop can (i) match the input tree with a context of arbitrary height, (ii) inspect the root symbol of the tree pushdown and (iii) push further symbols onto the tree pushdowns that control the rewriting of the remaining input subtrees. It is proven that the transformations of linear and nondeleting wpxtop are exactly those of simple wscftg. We note that this is in fact the generalization of a classical result from strings to weighted tree languages, viz. the characterization of simple syntax-directed translations by pushdown transducers of Lewis and Stearns.
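Second-order substitution itself is easy to illustrate. The following sketch uses hypothetical tree encodings (label plus child list); nothing here is from the talk. A symbol in a tree is replaced by a tree template whose variables x1, x2, ... receive the symbol's subtrees:

```python
# Second-order substitution on tuples (label, children): every node
# labelled `symbol` is replaced by `template`, whose variables x1, x2, ...
# receive the node's (already substituted) subtrees.

def subst(tree, symbol, template):
    label, children = tree
    children = [subst(c, symbol, template) for c in children]
    if label != symbol:
        return (label, children)
    return plug(template, children)

def plug(template, args):
    """Instantiate the template's variables with the given subtrees."""
    label, children = template
    if label.startswith("x") and label[1:].isdigit():
        return args[int(label[1:]) - 1]    # variable xi -> i-th subtree
    return (label, [plug(c, args) for c in children])

# sigma(a, b) rewritten with the rule  sigma -> gamma(x2, x1)
t = ("sigma", [("a", []), ("b", [])])
rule = ("gamma", [("x2", []), ("x1", [])])
result = subst(t, "sigma", rule)   # gamma(b, a): the subtrees are swapped
```

The ability to reorder and duplicate subtrees in this way is what first-order (leaf) substitution cannot express.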

Dr. Manuel Bodirsky: Fragments of Presburger Arithmetic and Constraint Satisfaction

Similarly as in automata theory, we distinguish between qualitative and quantitative constraint formalisms. Qualitative constraint formalisms have been studied intensively in spatial and temporal reasoning in Artificial Intelligence, and many tools from universal algebra can be used to study their computational complexity, in the same way as they are used to study finite domain constraint satisfaction problems. Quantitative constraint formalisms are harder to study from a complexity point of view, but they naturally appear in applications where some form of numeric reasoning is needed.
In this talk, we focus on a fundamental class of quantitative constraint satisfaction problems, namely the CSPs for constraint languages that are first-order definable over (Z,+,<), that is, CSPs for fragments of Presburger arithmetic (with order). I will present partial complexity classification results, and many open problems related to a full classification of CSPs for fragments of Presburger arithmetic, ranging from concrete algorithmic problems to open questions in universal algebra and model theory.
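As a taste of such a fragment, difference constraints of the form x - y <= c are first-order definable over (Z,+,<), and their satisfiability reduces to negative-cycle detection, a standard fact sketched here on an invented instance:

```python
# Satisfiability of difference constraints x - y <= c over the integers,
# a simple constraint language definable over (Z, +, <), via
# Bellman-Ford-style relaxation (a standard reduction, illustrative only).

def satisfiable(variables, constraints):
    """constraints: list of (x, y, c) encoding x - y <= c over Z."""
    dist = {v: 0 for v in variables}    # distances from a virtual source
    for _ in range(len(variables)):
        changed = False
        for x, y, c in constraints:     # edge y -> x with weight c
            if dist[y] + c < dist[x]:
                dist[x] = dist[y] + c
                changed = True
        if not changed:
            return True                 # dist assigns a solution
    return False                        # negative cycle: unsatisfiable

# x - y <= 1 and y - x <= -2 would force 0 <= -1: unsatisfiable.
unsat = satisfiable(["x", "y"], [("x", "y", 1), ("y", "x", -2)])   # False
sat = satisfiable(["x", "y"], [("x", "y", 1), ("y", "x", 0)])      # True
```

Richer fragments of Presburger arithmetic quickly leave this comfortable polynomial-time territory, which is exactly where the classification questions of the talk begin.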

Marc Denecker: The FO(.) Knowledge Base System project: an integration project

FO(.) is used here as a generic term to denote extensions of classical first-order logic FO. On the logical level, the goal of this project is to achieve a conceptually clean (non-hybrid) integration of logic constructs from different computational logic paradigms (FO, logic programming extensions such as Datalog, Abductive Logic Programming, Answer Set Programming, fixpoint logics, constraint programming) in the context of classical logic. On the computational level, the project aims to integrate and extend technologies developed in the respective fields to build a Knowledge Base System that supports various forms of inference. I will explain the motivations, principles and new research questions raised by such a project. I will give an overview of the current system and some applications. An application for interactive configuration will serve to highlight a principle that separates declarative modelling languages from (procedural or declarative) programming languages: the reuse of a single model to solve different computational tasks by applying different forms of inference.

Uli Fahrenberg: A Compositional Algebra of Specifications

Logical or structural specifications are a popular device for specifying the expected behavior of systems. To be useful also for large systems, it is important that a specification formalism be compositional, so that systems and specifications can be combined in an iterative development process. I will outline recent work on what such a compositional algebra of specifications can look like, and I will try to hint at how such an algebra may be generalized to quantitative specifications.

Oliver Fernández Gil: On the Non-Monotonic Description Logic ALC+Tmin

In the last 20 years many proposals have been made to incorporate non-monotonic reasoning into description logics, ranging from approaches based on default logic and circumscription to those based on preferential semantics. In particular, the non-monotonic description logic ALC+Tmin uses a combination of the preferential semantics with the minimization of a certain kind of concepts, which represent atypical instances of a class of elements. One of its drawbacks is that it suffers from the problem known as the blocking of property inheritance, which can be seen as a weakness from an inferential point of view. In this talk we present an extension of ALC+Tmin, namely ALC+T+min, with the purpose of solving the mentioned problem. In addition, we show the close connection that exists between ALC+T+min and concept-circumscribed knowledge bases. Finally, we study the complexity of deciding the classical reasoning tasks in ALC+T+min.

Markus Krötzsch: Non-Ontology-based Data Access: Schema-Agnostic Query Rewriting for OWL DL

SPARQL 1.1 supports the use of ontologies to enrich query results with logical entailments, and OWL 2 provides a dedicated fragment OWL QL for this purpose. Typical implementations use the OWL QL schema to rewrite a conjunctive query into an equivalent set of queries, to be answered against the non-schema part of the data. With the adoption of the recent SPARQL 1.1 standard, however, RDF databases are capable of answering much more expressive queries directly, and we ask how this can be exploited in query rewriting. We find that SPARQL 1.1 is powerful enough to “implement” a full-fledged OWL QL reasoner in a single query. Using additional SPARQL 1.1 features, we develop a new method of schema-agnostic query rewriting, where arbitrary conjunctive queries over OWL QL are rewritten into equivalent SPARQL 1.1 queries in a way that is fully independent of the actual schema. This allows us to query RDF data under OWL QL entailment without extracting or preprocessing OWL axioms.
This is joint work with Stefan Bischof, Axel Polleres, and Sebastian Rudolph.
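The recursion that such rewriting must capture can be shown outside SPARQL on invented toy data: instance retrieval under subclass reasoning needs the transitive closure of the subclass relation, which SPARQL 1.1 property paths (e.g. rdfs:subClassOf*) can express directly in the query:

```python
# Instance retrieval under subclass reasoning on invented toy triples
# (not the paper's rewriting): the answer requires a transitive closure
# over subClassOf, which is why plain pattern matching is not enough.

SUBCLASS, TYPE = "subClassOf", "type"

def instances_of(cls, triples):
    subs, frontier = {cls}, [cls]       # all subclasses of cls, incl. cls
    while frontier:
        c = frontier.pop()
        for s, p, o in triples:
            if p == SUBCLASS and o == c and s not in subs:
                subs.add(s)
                frontier.append(s)
    return sorted(s for s, p, o in triples if p == TYPE and o in subs)

triples = [
    ("Dog", SUBCLASS, "Mammal"),
    ("Mammal", SUBCLASS, "Animal"),
    ("rex", TYPE, "Dog"),
    ("tweety", TYPE, "Bird"),
]
animals = instances_of("Animal", triples)   # ["rex"]
```

The schema-agnostic approach of the talk goes much further, handling full OWL QL entailment inside a single SPARQL 1.1 query.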

Wiebe van der Hoek: Program equilibrium—a program reasoning approach

The concept of program equilibrium, introduced by Howard (Theory and Decision 3, 1988) and further formalised by Tennenholtz (Games and Economic Behavior, 2004), represents one of the most ingenious and potentially far-reaching applications of ideas from computer science in game theory to date. The basic idea is that a player in a game selects a strategy by entering a program, whose behaviour may be conditioned on the programs submitted by other players. Thus, for example, in the prisoner's dilemma, a player can enter a program that says "If his program is the same as mine, then I cooperate, otherwise I defect". It can easily be shown that if such programs are permitted, then rational cooperation is possible even in the one-shot prisoner's dilemma. In the original proposal of Tennenholtz, comparison between programs was limited to syntactic comparison of program texts. While this approach has some considerable advantages (not the least being computational and semantic simplicity), it also has some important limitations.
In the talk, we investigate an approach to program equilibrium in which richer conditions are allowed, based on model checking—one of the most successful approaches to reasoning about programs. We introduce a decision-tree model of strategies, which may be conditioned on strategies of others. We then formulate and investigate a notion of "outcome" for our setting, and investigate the complexity of reasoning about outcomes. We focus on coherent outcomes: outcomes in which every decision by every player is justified by the conditions in his program. We identify a condition under which there exists a unique coherent outcome. We also compare our notion of (coherent) outcome with the (supported) semantics known from logic programming. We illustrate our approach with many examples.
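The syntactic-comparison idea from Tennenholtz's original proposal is easy to sketch. Below, a "program" is simply its own source text, evaluated against the opponent's text; this is a deliberately tiny illustrative model, not the formalism of the talk:

```python
# Program equilibrium with syntactic comparison, in miniature: each
# submitted "program" is a Python expression over the names `mine` and
# `other` (the two source texts) returning "C" (cooperate) or "D" (defect).

def run(program, opponent):
    """Execute `program` given both source texts."""
    return eval(program, {}, {"mine": program, "other": opponent})

clique = '"C" if other == mine else "D"'   # cooperate iff texts match
defect = '"D"'                             # unconditional defection

a = run(clique, clique)   # "C": mutual cooperation is self-enforcing
b = run(clique, defect)   # "D": deviation gains nothing
c = run(defect, clique)   # "D"
```

Since a player deviating from the clique program is met with defection, submitting it is a best response to itself, which is the equilibrium the abstract describes; the talk replaces the brittle text-equality test with model-checked conditions.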

Gabriele Kern-Isberner: Novel Semantical Approaches to Relational Probabilistic Conditionals under Maximum Entropy

Reasoning on maximum entropy is well known for its excellent (formal and commonsensical) properties, and the same holds for its extension to belief change by the principle of minimum cross-entropy. However, the optimum entropy methods have been considered principally in propositional frameworks, perhaps mostly due to the fact that there is no clear semantics for interpreting first-order probabilistic (conditional) sentences. It seems to be a common understanding that either a statistical approach that counts (tuples of) individuals has to be used to interpret such sentences, or the knowledge base has to be grounded to make a possible-worlds semantics applicable, for a subjective interpretation of probabilities.

In this talk, after recalling the principles of optimum entropy and the benefits they bring to uncertain reasoning, I propose novel semantical perspectives on first-order (or relational) probabilistic conditionals that are motivated by considering them as subjective, but population-based statements. Two different semantics for relational probabilistic conditionals are presented to which the maximum entropy principle can be applied, yielding quite well-behaved inference operators. This will be evaluated according to a set of postulates for suitable inference operators in this framework.

This is joint work with Matthias Thimm (Universität Koblenz-Landau).
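For a flavor of maximum entropy in the simplest propositional case only (far from the relational setting of the talk): under a single constraint P(E) = p, the maximum-entropy distribution spreads p uniformly over E and 1 - p uniformly over its complement. A small sketch with invented worlds:

```python
# Closed-form maximum entropy under one constraint P(event) = p:
# uniform inside the event and uniform on its complement.  The worlds
# and the constraint below are invented for illustration.
from math import log

def maxent_single_constraint(worlds, event, p):
    inside = [w for w in worlds if w in event]
    outside = [w for w in worlds if w not in event]
    return {**{w: p / len(inside) for w in inside},
            **{w: (1 - p) / len(outside) for w in outside}}

def entropy(dist):
    return -sum(q * log(q) for q in dist.values() if q > 0)

# Constraint: P(the bird flies) = 0.9 over four invented worlds.
dist = maxent_single_constraint(
    ["fly&sing", "fly&quiet", "ground&sing", "ground&quiet"],
    {"fly&sing", "fly&quiet"}, 0.9)
# dist["fly&sing"] == 0.45 and dist["ground&sing"] == 0.05
```

In the relational setting of the talk, the choice of semantics determines which worlds and constraints this optimization ranges over, which is precisely where the two proposed semantics differ.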