2019
 2019-Nov-22 Talk
APB 3027, TU Dresden, 10:00
 Title: Efficiently Solving Unbounded Integer Programs in the context of SMT Solvers

Martin Bromberger, Max-Planck-Institut für Informatik, Saarbrücken

Abstract:
Satisfiability modulo theories (SMT) solvers are automated theorem provers for logical formulas that range over combinations of various first-order theories. These theories typically correspond to domains found in programming languages, e.g., the theories of bit vectors, integers, and arrays. This is intentional because SMT solvers were initially developed as backend reasoning tools for automated software verification. These days, SMT solvers are also used as backend reasoning tools for various other applications, e.g., verification of hybrid systems, program synthesis, and as brute-force tactics in various interactive theorem provers.
In this talk, I will present two new techniques for the theory of linear integer arithmetic in the context of SMT solvers:
1) The unit cube test [2,3], a sound (although incomplete) test that finds solutions for integer programs (i.e., systems of linear inequalities) in polynomial time. The test is especially efficient on absolutely unbounded integer programs, which are difficult to handle for many other decision procedures.
2) A bounding transformation [1] that reduces any integer program in polynomial time to an equisatisfiable integer program that is bounded. The transformation is beneficial because it turns branch and bound into a complete and efficient decision procedure for integer programs.
[1] A Reduction from Unbounded Linear Mixed Arithmetic Problems into Bounded Problems, Martin Bromberger. IJCAR 2018, volume 10900 of LNCS, pages 329–345. Springer, 2018.
[2] New Techniques for Linear Arithmetic: Cubes and Equalities, Martin Bromberger and Christoph Weidenbach. FMSD, volume 51(3), pages 433–461. Springer, 2017.
[3] Fast Cube Tests for LIA Constraint Solving, Martin Bromberger and Christoph Weidenbach. IJCAR 2016, volume 9706 of LNCS, pages 116–132. Springer, 2016.
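The unit cube test described in the abstract above lends itself to a compact illustration. The following is a hedged sketch, not the authors' implementation; the example system and the rational point are hypothetical. For a system Ax ≤ b, each bound b_i is tightened by half the 1-norm of row a_i; any rational solution of the tightened system (e.g., one produced by an off-the-shelf LP solver) then rounds componentwise to an integer solution of the original system.

```python
from fractions import Fraction as F
from math import floor

def tighten(A, b):
    # Unit cube test: replace each bound b_i by b_i - ||a_i||_1 / 2, so a
    # unit cube centered at any solution fits inside the polyhedron.
    return [bi - sum(abs(a) for a in row) / F(2) for row, bi in zip(A, b)]

def satisfies(A, b, x):
    return all(sum(a * v for a, v in zip(row, x)) <= bi
               for row, bi in zip(A, b))

def round_to_integer(x):
    # Round each coordinate to a nearest integer.
    return [floor(v + F(1, 2)) for v in x]

# Hypothetical example: 2x + 3y <= 12, -x <= 0, -y <= 0.
A = [[F(2), F(3)], [F(-1), F(0)], [F(0), F(-1)]]
b = [F(12), F(0), F(0)]
bt = tighten(A, b)          # bounds become 19/2, -1/2, -1/2
z = [F(1, 2), F(1, 2)]      # a rational solution of the tightened system
assert satisfies(A, bt, z)
print(round_to_integer(z))  # → [1, 1], an integer solution of the original
```

The tightened system is exactly the condition that a unit cube centered at x lies inside the polyhedron, which is why the rounding step is safe.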
APB 3027, TU Dresden, 10:00
 Title: Rational closure for all description logics

Abstract:
Nonmonotonic logics constitute an appealing target format for quantitative assertions. The link between nonmonotonic reasoning and probabilities has been extensively studied, and it would be interesting to encode with nonmonotonic assertions the output of machine learning algorithms, which is bound to be erroneous to some extent. The retraction and conflict management capabilities of nonmonotonic logics can potentially deal with such errors. The advantage, in the long run, would be learning structured knowledge whose consequences can be explained (a sorely missed feature in modern machine learning). The nonmonotonic semantics used for such purposes should be fully general and widely agreed upon. Currently, generality is an issue for the logics based on the postulates by Kraus, Lehmann and Magidor (KLM), which require the disjoint union model property. We argue that this is due to a "bug" in the definition of rational closure, and present an amended definition that can be applied to unrestricted description logics. We conclude by discussing whether this is good or bad news for KLM-based approaches.
APB 3027, TU Dresden, 10:00
 Title: Crisp-determinization of weighted tree automata over strong bimonoids

Abstract:
Initial algebra semantics and run semantics for weighted tree automata (wta) over strong bimonoids do not coincide in general. We consider the problems whether, for a given wta,
(i) a crisp-deterministic wta can be constructed which is equivalent to the original wta with respect to the initial algebra semantics, and
(ii) the same problem except that the initial algebra semantics is replaced by the run semantics.
In both cases, we give sufficient conditions for the existence of such a crisp-deterministic wta. We also deal with the problem of whether the existence of such a crisp-deterministic wta is decidable. Most of our results are generalizations of the corresponding ones for weighted string automata over strong bimonoids in the papers:
M. Droste, T. Stüber, and H. Vogler. Weighted finite automata over strong bimonoids. Inform. Sci., 180(1):156–166, 2010.
M. Ćirić, M. Droste, J. Ignjatović, and H. Vogler. Determinization of weighted finite automata over strong bimonoids. Inform. Sci., 180(18):3479–3520, 2010.
 2019-Sep-05 Talk
Hörsaal 6, Hörsaalgebäude, Universitätsstraße 3, Leipzig, 16:15
 Title: What's Quantum Computing Anyway?

Abstract:
link (pdf)
 2019-Sep-04 Talk
P501, Uni Leipzig, 14:15
 Title: LOGIC in computer science and software industry
(and in mathematics, time permitting) 
Abstract:
In software industry, engineers do formal logic day in and day out, even though they may not realize that. As a rule, they have not studied logic. Instead, they studied calculus which they use rarely, if ever. I will try to illustrate why logic is relevant for software industry and why it is hard for software engineers to pick it up.
(The story of logic in math departments is very different, peculiar and uncanny.)
 2019-Jul-23 Talk
APB 3027, TU Dresden, 13:00 – 14:30
 Title: Extending EL++ with Linear Constraints on the Probability of Axioms

Abstract:
One of the main reasons to employ a description logic such as EL++ is the fact that it has efficient, polynomial-time algorithmic properties for tasks such as deciding consistency and inferring subsumption. However, simply by adding negation of concepts to it, we obtain the expressivity of description logics whose decision procedure is ExpTime-complete. A similar complexity explosion occurs if we add probability assignments on concepts. To lower the resulting complexity, we instead concentrate on assigning probabilities to axioms/GCIs. We show that the consistency detection problem for such a probabilistic description logic is NP-complete, and present a linear algebraic deterministic algorithm to solve it, using the column generation technique. We also examine algorithms for the probabilistic extension problem, which consists of inferring the minimum and maximum probabilities for a new axiom, given a consistent probabilistic knowledge base.
Future work aims at finding fragments of probabilistic EL++ which are tractable.
 2019-May-09 Talk
APB 3027, TU Dresden, 13:00 – 14:30
 Title: Quine's Fluted Fragment

Ian Pratt-Hartmann, University of Manchester

Abstract:
We consider the fluted fragment, a decidable fragment of first-order logic with an unbounded number of variables, originally identified in 1968 by W. V. Quine. We show that the satisfiability problem for this fragment has non-elementary complexity, thus refuting an earlier published claim by W. C. Purdy that it is in NExpTime. More precisely, we consider the intersection of the fluted fragment and the m-variable fragment of first-order logic, for all nonnegative m. We obtain upper and lower complexity bounds for this fragment that coincide for all m up to the value 4.
 2019-Apr-25 Talk
APB 3027, TU Dresden, 13:00 – 14:30
 Title: Data Science use cases for Lifestyle Banking

Boontawee (Meng) Suntisrivaraporn, Siam Commercial Bank

Abstract:
Traditional banking concerns itself with risk understanding, credit underwriting, cash needs, liquidity, etc. Data science and machine learning have proved useful in banking businesses to mitigate risk while targeting the right customers with cash needs. In this talk, we will explore the lifestyle side of banking, which goes beyond the traditional realm and delves into alternative signals of customers' needs. We will see example data science use cases that have been successfully implemented in the banking business at Siam Commercial Bank.
 2019-Apr-10 Talk
P501, Universität Leipzig, 15:00 – 16:30
 Title: Finite Automata over Conway Semirings

Abstract:
We present a general version of Kleene's theorem for Conway semirings that is applicable in many situations.
 2019-Jan-24 Talk
P801, Universität Leipzig, 13:30 – 15:00
 Title: Operations on Arrays

Abstract:
This talk aims to introduce a few types of automata and grammar models and gives an overview of the class of languages recognized by them in the two-dimensional world.
A two-dimensional string (or a picture) is a two-dimensional rectangular array of symbols taken from a finite alphabet.
A two-dimensional language (or picture language) is a set of pictures.
2018
 2018-02-28 Talk
P501, Universität Leipzig, 11:00 – 13:00
 Title: Eilenberg Theorems for Free

Abstract:
Eilenberg-type correspondences, relating varieties of languages (e.g., of finite words, infinite words, or trees) to pseudovarieties of finite algebras, form the backbone of algebraic language theory. We show that they all arise from the same recipe: one models languages and the algebras recognizing them by monads on an algebraic category, and applies a Stone-type duality. Our main contribution is a variety theorem that covers e.g. Wilke's and Pin's work on ∞-languages, the variety theorem for cost functions of Daviaud, Kuperberg, and Pin, and unifies the two categorical approaches of Bojańczyk and of Adámek et al. In addition we derive new results, such as an extension of the local variety theorem of Gehrke, Grigorieff, and Pin from finite to infinite words.
2017
 2017-11-30 Talk
APB/3027, TU Dresden, 09:20 – 10:50
 Title: Generalized Consistent Query Answering under Existential Rules

Abstract:
In this talk, I give an overview of recent results on generalised consistent query answering under existential rules. Previous work has proposed consistent query answering as a way to resolve inconsistencies in ontologies. In these approaches to consistent query answering, however, only inconsistencies due to errors in the underlying database are considered. In recent works, we additionally assume that ontological axioms may be erroneous, and that some database atoms and ontological axioms may not be removed to resolve inconsistencies. This setting is especially well suited to debugging mappings between distributed ontologies. We define two different semantics, one where ontological axioms as a whole are ignored to resolve an inconsistency, and one where only some of their instances are ignored. We then give a precise picture of the complexity of consistent query answering under these two semantics when ontological axioms are encoded as different classes of existential rules.
 2017-12-08 Talk
APB/3027, TU Dresden, 10:30 – 11:30
 Title: Interpolation, Amalgamation and Combination

Abstract:
We study the conditions under which the existence of interpolants (for quantifier-free formulae) is modular, in the sense that it can be transferred from two first-order theories T1, T2 to their combination T1 ∪ T2. As an interesting application, we relate the combinability criterion we obtain for theories sharing a common Horn subtheory to superamalgamability conditions known from propositional logic, and we use this fact to derive results concerning the transfer of interpolation properties under fusions in modal logic. This is joint work with A. Gianola.
 2017-12-08 Talk
APB/3027, TU Dresden, 09:00 – 10:00
 Title: Horn Rewritability vs PTime Query Evaluation for Description Logic TBoxes

Abstract:
TBA
 2017-11-03 Talk
APB/E009, TU Dresden, 13:00 – 14:30
 Title: Automatic Behavior Composition

Sebastian Sardina, RMIT University (Melbourne, Australia)

Abstract:
With computers now present in everyday devices like mobile phones, credit cards, cars and planes, or places like homes, offices and factories, the trend is to build embedded complex systems from a collection of simple components. For example, a complex smart house system can be "realised" (i.e., implemented) by suitably coordinating the behaviours (i.e., the operational logic) of hundreds (or thousands) of simple devices and artifacts installed in the house, such as lights, blinds, game consoles, a vacuum cleaner, video cameras, audio systems, TVs, a floor cleaning robot, etc. The problem of automatically synthesising, that is, building, an effective coordinator controller for a given desired target system is called the behaviour composition problem and is the focus of this talk. The composition task can be recast in a variety of domains, including robot ecologies, ubiquitous robots or intelligent spaces, web services, component-based development (CBD), agent programming, and automated planning.
The talk will cover the standard formalization of the problem, several proposed extensions, and the main computational techniques. Importantly, the behavior composition problem draws from various areas of Artificial Intelligence and Computer Science, including verification, reasoning about action, web services, and generalized planning.
A main reference for this work is the following one:
Giuseppe De Giacomo, Fabio Patrizi, Sebastian Sardiña: Automatic behavior composition synthesis. Artificial Intelligence 196: 106–142 (2013). This is joint work done with Prof. Giuseppe De Giacomo and Dr. Fabio Patrizi (Sapienza University, Italy) and Dr. Nitin Yadav (Melbourne University, Australia).
 2017-04-19 Talk
P501, Universität Leipzig, 15:00 – 16:30
 Title: Formal methods for the verification of distributed algorithms

Abstract:
We introduce an automata-theoretic method for the verification of distributed algorithms running on ring networks. In a distributed algorithm, an arbitrary number of processes cooperate to achieve a common goal (e.g., elect a leader, sort an array). Processes have unique identifiers (pids) from an infinite, totally ordered domain. An algorithm proceeds in synchronous rounds, each round allowing a process to perform a bounded sequence of actions such as send or receive a pid, store it in some register, and compare register contents wrt. the associated total order. An algorithm is supposed to be correct independently of the number of processes. To specify correctness properties, we introduce a logic that can reason about processes and pids. Referring to leader election, it may say that, at the end of an execution, each process stores the maximum pid in some dedicated register. We show that the verification problem of distributed algorithms can be reduced to satisfiability of a formula from propositional dynamic logic with loop and converse (LCPDL), interpreted over cylinders over a finite alphabet. This translation is independent of any restriction imposed on the algorithm. However, since the verification problem (and satisfiability for LCPDL) is undecidable, we propose an under-approximation technique, which bounds the number of rounds. This is an appealing approach, as the number of rounds needed by a distributed algorithm to conclude is often exponentially smaller than the number of processes. Using our reduction to LCPDL, we provide an automata-theoretic solution, reducing model checking to emptiness for alternating two-way automata on words. Overall, we show that round-bounded verification of distributed algorithms over rings is PSPACE-complete, provided the number of rounds is given in unary.
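The leader-election example from the abstract can be simulated in a few lines. This is a toy illustration of the algorithm class under verification, not of the verification method itself; the max-propagation scheme below is the simplest such ring algorithm, and the example pids are made up.

```python
def elect_leader(pids):
    """Synchronous max-propagation on a unidirectional ring: in each
    round, every process sends its current register to its right
    neighbour and keeps the larger of its register and the incoming
    pid. After n - 1 rounds, each register holds the maximum pid."""
    n = len(pids)
    reg = list(pids)
    for _ in range(n - 1):
        incoming = [reg[(i - 1) % n] for i in range(n)]  # one synchronous round
        reg = [max(r, m) for r, m in zip(reg, incoming)]
    return reg

print(elect_leader([3, 17, 5, 9]))  # → [17, 17, 17, 17]
```

The correctness property mentioned in the abstract ("each process stores the maximum pid in some dedicated register") must hold regardless of n, which is exactly the parameterized flavour that makes verification hard.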
 2017-01-09 Talk
P801, Universität Leipzig, 13:00 – 14:00
 Title: Average motif counts in Markov chains of context-free graph transformation systems

Abstract:
The dynamics of the average molecule count of a chemical species in a test tube over time is the paradigm example for motif counts in continuous-time systems with discrete state space. More generally, we can consider average counts of "observable" graph motifs in Markov processes that are specified by graph transformation systems (GTS). Such average counts can be used to measure network flows, protein production in cells, and the speed of DNA walkers. Computing average motif counts is typically extremely complicated, if not impossible, as it often amounts to solving the master equation of the underlying Markov chain with countable state space. The situation is quite different for the graph transformation analogue of context-free Chomsky grammars, namely context-free GTSs, aka hyperedge replacement systems. For these systems, it is always possible to give for each motif a finite ordinary differential equation that captures the time evolution of the mean count of the motif precisely. To illustrate the analysis technique, the talk uses the preferential attachment process as an example, a popular model in the context of citation networks.
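For the paradigm example of an average molecule count, the simplest instance is exactly solvable: a pure birth process with constant rate k has the mean-count ODE dE[N]/dt = k, hence E[N](t) = k·t. The following sketch is hypothetical and not from the talk (rate, horizon, and seed are made up); it checks the ODE prediction against a Gillespie-style simulation.

```python
import random

def birth_count(k, t_end, rng):
    """One Gillespie run of the pure birth process: birth events occur
    after exponential waiting times with rate k; return the molecule
    count at time t_end."""
    t, n = 0.0, 0
    while True:
        t += rng.expovariate(k)  # waiting time until the next birth
        if t > t_end:
            return n
        n += 1

rng = random.Random(42)
runs = [birth_count(k=2.0, t_end=3.0, rng=rng) for _ in range(4000)]
avg = sum(runs) / len(runs)
print(round(avg, 1))  # close to the ODE prediction k * t_end = 6.0
```

For context-free GTSs the abstract's point is that such closed, finite ODEs for mean motif counts always exist, even where the full master equation is intractable.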
2016
 2016-06-22 Talk
P502, Universität Leipzig, 15:00 – 16:00
 Title: Parallel parsing of Operator Precedence languages

Abstract:
Operator precedence grammars, introduced by Floyd several decades ago, enjoy properties that make them attractive for tackling problems and exploiting technologies that are highly relevant today. An important instance is the local parsability property, i.e., the fact that any substring of a longer string can be parsed independently of its context, without the risk of invalidating the partial parse when analyzing another portion of the whole string. We exploit this distinguishing property by developing parallel algorithms and a tool, and suggest its further application to incremental analysis.
 2016-06-15 Talk
APB/3027, TU Dresden, 10:00
 Title: Mining DL ontologies

Ulrike Sattler, University of Manchester, School of Computer Science

Abstract:
In a variety of domains and for a variety of purposes, people are using Description Logic (DL) ontologies: depending on how we count, there are hundreds or thousands of them. Quite often, these ontologies have a rather lightweight class hierarchy, and come with data in the form of an ABox or are used in applications together with external data. In the presence of this data, a naturally arising question is whether we can use this data to mine or learn general concept inclusion axioms, i.e., TBox axioms. If so, how can we do this, and how complex can these axioms be? In my talk, I will discuss these questions and describe our approach, which has been implemented in DL Miner. We aim at identifying hypotheses that capture interesting correlations in the data in the ontology. Our approach can be said to be logically and statistically sound: it works with DL's open world assumption, and identifies suitable hypotheses by regarding this as a multidimensional optimisation problem (e.g., we are after hypotheses that are maximally supported and minimally brave). I will explain the setting for this problem, our approach, and our experiences with DL Miner. I will assume that the audience has a basic understanding of DLs, but this talk should also be of interest to anybody with some understanding of First Order Logic. This is joint work with Slava (Viachaslau) Sazonau.
 2016-06-10 Talk
APB/1004, TU Dresden, 13:30 – 14:30
 Title: Structural Subsumption Based Similarity Measures: applications with a medical ontology and in image object recognition

Boontawee Suntisrivaraporn, Sirindhorn International Institute of Technology (SIIT)

Abstract:
Description Logics (DLs) are a family of logic-based knowledge representation formalisms, which can be used to develop ontologies in a formally well-founded way. The standard reasoning service of subsumption has proved indispensable in ontology design and maintenance: it checks, relative to the logical definitions in the ontology, whether one concept is more general/specific than another. When no subsumption relationship is identified, however, no information about the two concepts can be given. We introduce a similarity measure based on the structural subsumption algorithm for the DL EL and then generalize this to one for the DL ALEH. The proposed similarity measures compute a numerical degree of similarity between two concept descriptions even when they are not in a subsumption relation. In this talk, two application scenarios will be discussed: the first concerns new findings in SNOMED CT, whereas the second is the employment of the similarity measure in an image object recognition framework.
 2016-06-08 Talk
P502, Universität Leipzig, 13:30 – 15:00
 Title: Weighted computational models

Abstract:
We present weighted computational models like automata over words and trees, and context-free grammars. We show that properties of their behaviors are closely related to the properties of the underlying weight structure. We list several open problems, especially ones arising in practical applications.
 2016-05-12 Talk
APB/3027, TU Dresden, 13:30 – 14:30
 Title: Inconsistency Handling in Ontology-Mediated Query Answering

Camille Bourgaux, Laboratoire de Recherche en Informatique at Université Paris-Sud, France

Abstract:
The problem of querying description logic knowledge bases using conjunctive queries has been a major focus of recent description logic research. An important issue that arises in this context is how to handle the case in which the data is inconsistent with the ontology. Inconsistency-tolerant semantics can be used to obtain meaningful answers. AR, IAR, and brave are three well-known such semantics that rely on the notion of a repair, that is, an inclusion-maximal subset of the data consistent with the ontology. I will present work on dealing with inconsistent DL-Lite knowledge bases using these semantics. Beyond efficient query answering, we address the problems of explaining the query results and of query-driven repairing of the data. We also consider variants of these semantics based on preferred repairs.
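The repair-based semantics can be made concrete with a tiny propositional sketch. This is a hypothetical example (made-up data atoms; an explicit consistency check stands in for the DL-Lite reasoner): repairs are the inclusion-maximal consistent subsets of the data, and a query holds under AR if it holds in every repair, under IAR if it holds in the intersection of all repairs, and under brave if it holds in some repair.

```python
from itertools import combinations

def repairs(atoms, consistent):
    """Inclusion-maximal subsets of the data that pass the
    (ontology-dependent) consistency check."""
    cons = [frozenset(c) for r in range(len(atoms), -1, -1)
            for c in combinations(sorted(atoms), r)
            if consistent(frozenset(c))]
    return [s for s in cons if not any(s < t for t in cons)]

def ar(query, reps):    return all(query(r) for r in reps)           # every repair
def iar(query, reps):   return query(frozenset.intersection(*reps))  # intersection
def brave(query, reps): return any(query(r) for r in reps)           # some repair

# Hypothetical data: A(a), B(a), C(a); the ontology forbids A and B together.
data = {"A(a)", "B(a)", "C(a)"}
ok = lambda s: not {"A(a)", "B(a)"} <= s
reps = repairs(data, ok)       # the two repairs {A(a),C(a)} and {B(a),C(a)}
q_c = lambda s: "C(a)" in s
q_a = lambda s: "A(a)" in s
print(iar(q_c, reps), ar(q_a, reps), brave(q_a, reps))  # → True False True
```

The three answers diverge exactly as the abstract suggests: C(a) survives even the cautious IAR semantics, while A(a) is only bravely entailed.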
2015
 2015-11-12 Talk
APB/3027, TU Dresden, 13:30
 Title: Language Equations

Alexander Okhotin, Ph.D., Department of Mathematics and Statistics, University of Turku, Finland

Abstract:
Equations with formal languages as unknowns appear in many areas where sets of strings are used. Language equations with one-sided concatenation naturally represent finite automata, and have been used as a tool for deciding various logics. Equations involving concatenation of variables are fundamental for the theory of formal grammars; in particular, the FO(LFP) logic that subsumes all kinds of grammars can be regarded as a far-reaching generalization of language equations. Some further types of equations can represent computationally universal sets, and beyond. For each kind of language equations, the typical research questions are assessing their expressive power (what languages can be represented, e.g., as a unique solution of some equation?) and determining the complexity of their decision problems (e.g., how hard is it to test whether a given equation has any solutions?). In the present talk, results of this kind for various types of language equations shall be surveyed and compared to each other.
 2015-09-03 Talk
Room t.b.a., Leipzig, 13:15 – 15:00
 Title: Uniform Eilenberg Theorems: algebraic automata theory based on categories

PD Dr. Stefan Milius, Friedrich-Alexander-Universität Erlangen-Nürnberg

Abstract:
Eilenberg's variety theorem, a centerpiece of algebraic automata theory, establishes a bijective correspondence between varieties of languages and pseudovarieties of monoids. In this talk I will show how to generalize this result from sets and Boolean algebras to an abstract pair of algebraic categories. We introduce varieties of languages in a category C, and prove that they correspond to pseudovarieties of monoids in a category D that is a locally finite commutative variety, provided that C and D are dual on the level of finite objects. By suitable choices of these categories our result uniformly covers Eilenberg's theorem and three variants due to Pin, Polák and Reutenauer, respectively, and yields new Eilenberg-type correspondences. I will also show that the syntactic monoid of a language is generalized to the level of the category D. This allows for a uniform treatment of several notions of syntactic algebras known in the literature, including the syntactic monoids of Rabin and Scott (D = sets), the syntactic semirings of Polák (D = semilattices), and the syntactic associative algebras of Reutenauer (D = vector spaces).
 2015-07-29 Talk
INF 1004, TU Dresden, 13:30
 Title: Algorithms for Computing a Comprehensive Groebner Basis of a Parametric Polynomial System

Deepak Kapur, The University of New Mexico, Albuquerque, NM, USA

Abstract:
A parametric polynomial system is a finite set of polynomial equations in which variables are classified into parameters and indeterminates. Many engineering and scientific problems can be modeled using parametric polynomial systems in which the same problem has to be analyzed for different values of the parameters. Applications include geometry theorem proving and discovery, image understanding and geometric modeling, and more recently program analysis using quantifier elimination. In 1992, Weispfenning introduced the concepts of a comprehensive Groebner basis (CGB) and a comprehensive Groebner system (CGS) of a parametric polynomial system, which generalized the Groebner basis proposed by Buchberger. A related concept, called a parametric Groebner basis, was proposed by Kapur in 1994, motivated by applications in image understanding. The talk will discuss recently developed efficient algorithms for computing a CGS and CGB of a parametric polynomial system. The algorithms have been implemented and successfully tried on many examples from the literature. Some open problems on this topic will also be proposed.
This research is a collaboration with Profs. Wang and Sun of the Chinese Academy of Sciences and my Ph.D. student Yiming Yang of UNM.
 2015-06-23 Seminar Lecture
Felix-Klein-Hörsaal P 501, Leipzig, 13:30 – 15:00
 Title: Matrix Interpretations on Polyhedral Domains

Abstract:
We refine the method of matrix interpretations for proving termination and complexity bounds of term rewrite systems: we restrict the interpretation domain by a system of linear inequalities. Admissibility of a restriction is shown by certificates whose validity can be expressed as a constraint program. This refinement is orthogonal to other features of matrix interpretations (complexity bounds, dependency pairs), but it can be used to improve complexity bounds, and we discuss its relation to the usable rules criterion. We present an implementation and experimental results.
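For readers unfamiliar with the base method, here is a minimal sketch of an (unrestricted) matrix interpretation, using a made-up one-rule system a(b(x)) → b(a(x)); the polyhedral-domain refinement from the abstract is not modeled. Each unary symbol is interpreted as an affine map x ↦ Mx + v over ℕ², and the rule is oriented by checking that the left-hand side dominates the right-hand side with a strict decrease in the first constant component.

```python
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_vec(A, v):
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

def compose(f, g):
    """Composition f o g of affine maps represented as (M, v): x -> Mx + v."""
    (Mf, vf), (Mg, vg) = f, g
    return mat_mul(Mf, Mg), [p + q for p, q in zip(mat_vec(Mf, vg), vf)]

# Interpretation: [a](x) = [[1,1],[0,1]] x, [b](x) = x + (0,1).
a = ([[1, 1], [0, 1]], [0, 0])
b = ([[1, 0], [0, 1]], [0, 1])

Ml, vl = compose(a, b)   # interprets the left-hand side  a(b(x))
Mr, vr = compose(b, a)   # interprets the right-hand side b(a(x))

# Orientation check: matrix parts dominate componentwise, the first
# constant component decreases strictly, the rest weakly.
assert all(Ml[i][j] >= Mr[i][j] for i in range(2) for j in range(2))
assert vl[0] > vr[0] and vl[1] >= vr[1]
print(vl, vr)  # → [1, 1] [0, 1]
```

Since every rewrite step strictly decreases the first component of the interpretation, the rule terminates; the talk's refinement additionally restricts the domain of x by linear inequalities, which can sharpen the derivable complexity bounds.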
 2015-05-13 Talk
TU Dresden, APB 0005, 10:00
 Title: Learning in Fuzzy Description Logics

Abstract:
Incompleteness and vagueness are inherent properties of knowledge in several real-world domains and are particularly pervasive in those domains where entities could be better described in natural language. The issues raised by incomplete and vague knowledge have traditionally been addressed in the field of Knowledge Representation within the stream of research devoted to fuzzy Description Logics (DLs). However, the problem of automatically managing the evolution of fuzzy DL ontologies still remains relatively unaddressed. In this talk, I present a method for inducing fuzzy GCI axioms from any crisp DL knowledge base. The method has been implemented and applied to real-world data in the tourism domain.
 2015-05-11 Talk
TU Dresden, APB 3105, 14:50
 Title: Modeling Concept Learning Problems with Second-Order Description Logics

Abstract:
Among the inferences studied in Description Logics (DLs), induction has received increasing attention over the last decade. Indeed, useful non-standard reasoning tasks can be based on the inductive inference. Among them, Concept Learning is about the automated induction of a description for a given concept starting from classified instances of the concept. In this talk I present a formal characterization of Concept Learning in DLs which relies on recent results in Knowledge Representation and Machine Learning. Based on second-order DLs, it allows for modeling Concept Learning problems as constructive DL reasoning tasks where the construction of the solution to the problem may be subject to optimality criteria.
 2015-05-04 Talk
Leipzig University, room P801, 15:00
 Title: Distances for Quantitative Verification

Abstract:
It has long been recognized in the verification community that for some applications, Boolean answers to verification problems do not suffice. Sometimes one would like to count things; sometimes one wants to verify models where quantitative information is essential and one needs the verification result to be robust under small quantitative changes; sometimes both the models and the properties to be verified are quantitative and one wants to know how far away the model is from satisfying the property. With the ever-increasing ubiquity of embedded systems, these types of quantitative verification are becoming more and more important.
Different applications foster different types of quantitative verification, but it turns out that most of these essentially measure some type of distances between labeled transition systems. Starting in 2010, and together with several co-authors, I have developed a unifying framework which allows one to reason about such distance-based quantitative verification independently of the precise distance. This is essentially a metric theory of labeled transition systems, with infinite quantitative games as its main theoretical ingredient and general fixed-point equations for linear and branching distances as one of its main results. We have successfully applied our framework in specification theories and more broadly in software engineering and statistical language processing.
 2015-03-17 Talk
Universität Leipzig, Felix-Klein-Hörsaal P501, 15:15 – 17:00
 Title: Decision Problems in Group Theory

Abstract:
It will be shown how certain group-theoretic constructions are used to study natural algorithmic questions. The Boone-Higman algebraic characterisation of precisely when a finitely generated group has soluble word problem will be discussed. All necessary algebraic and decision-theoretic background will be provided to make the talk generally accessible.
2014
 2014-11-04 Talk
Technische Universität Dresden, INF E005, 13:30
 Title: Containment of Monadic Datalog

Pierre Bourhis (CNRS)

Abstract:
We reconsider the problem of containment of monadic datalog (MDL) queries in unions of conjunctive queries (UCQs). Prior work has dealt with special cases, but has left the precise complexity characterization open. We begin by establishing a 2EXPTIME lower bound on the MDL/UCQ containment problem, resolving an open problem from the early 1990s. We then present a general approach for getting tighter bounds on the complexity, based on an analysis of the number of mappings of queries into tree-like instances. We use this machinery to present an important case of the MDL/UCQ containment problem that is in coNEXPTIME, and a case that is in EXPTIME. We then show that the technique can be used to get a new tight upper bound for containment of tree automata in UCQs. We show that the new MDL/UCQ upper bounds are tight.
 2014-06-30 Talk
Technische Universität Dresden, INF/3027, 16:00
 Title: Ontology-Based Data Access and Non-Uniform Constraint Satisfaction

Abstract:
Ontology-based data access is concerned with querying incomplete data sources in the presence of domain-specific knowledge provided by an ontology. A central notion in this setting is that of an ontology-mediated query, which is a database query coupled with an ontology. In this talk, we discuss several classes of ontology-mediated queries, where the database queries are given as some form of conjunctive query and the ontologies are formulated in description logics or other relevant fragments of first-order logic, such as the guarded fragment and the unary-negation fragment.
We then establish intimate connections between ontology-mediated queries and constraint satisfaction problems (CSPs) and their logical generalization, MMSNP formulas. These connections are exploited to obtain results regarding first-order and datalog rewritability of ontology-mediated queries and P/NP dichotomies for ontology-mediated queries.
 2014-06-26 Talk
Universität Leipzig, Room A 531, 15:00

Title: Kleene's theorem in partial Conway theories with an application to weighted tree automata

Abstract:
Each finite automaton may be seen as a system of fixed point equations and can thus be regarded as a morphism of a category enriched with a fixed point operation. The behavior of the automaton can be specified as another morphism of the same category. An advantage of this approach is that the same formalism is used for both the syntax and the semantics of automata. When the fixed point operation satisfies a few simple identities, we can prove a generic Kleene's theorem applicable in several contexts, including weighted tree automata.
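To make the fixed-point view concrete in the simplest special case: a classical word automaton induces language equations L_q = union of a.L_{q'} over transitions (q, a, q'), plus the empty word when q is final, and the least solution at the initial state is the accepted language. The sketch below is my own toy encoding (truncated to a finite word length so the Kleene iteration terminates), not the categorical setting of the talk:

```python
def least_solution(states, delta, finals, max_len):
    """Kleene/chaotic iteration for the language equations
        L_q = union over (q, a, q') in delta of {a . w : w in L_{q'}}
              union ({''} if q is final),
    truncated at words of length <= max_len so iteration terminates."""
    lang = {q: set() for q in states}
    changed = True
    while changed:
        changed = False
        for q in states:
            new = {''} if q in finals else set()
            for (p, a, r) in delta:
                if p == q:
                    new |= {a + w for w in lang[r] if len(w) < max_len}
            if new != lang[q]:
                lang[q], changed = new, True
    return lang

# Automaton for a*b: state 0 loops on 'a' and moves to final state 1 on 'b'.
sol = least_solution(states=[0, 1],
                     delta=[(0, 'a', 0), (0, 'b', 1)],
                     finals={1},
                     max_len=3)
print(sorted(sol[0]))  # ['aab', 'ab', 'b']
```

Because each equation's right-hand side is monotone in the unknowns, the iteration starting from the empty sets converges to the least solution; this is the identity-driven reasoning that the talk lifts to general (partial Conway) categories.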

 2014-05-23 PhD defense
Technische Universität Dresden, INF/E005, 13:15

Title: Fuzzy Description Logics with General Concept Inclusions

Abstract:
Description logics (DLs) are used to represent knowledge of an application domain and provide standard reasoning services to infer consequences of this knowledge. However, classical DLs are not suited to represent vagueness in the description of the knowledge. We consider a combination of DLs and Fuzzy Logics to address this task. In particular, we consider the t-norm-based semantics for fuzzy DLs introduced by Hájek in 2005. Since then, many tableau algorithms have been developed for reasoning in fuzzy DLs. Another popular approach is to reduce fuzzy ontologies to classical ones and use existing highly optimized classical reasoners to deal with them. However, a systematic study of the computational complexity of the different reasoning problems is so far missing from the literature on fuzzy DLs. Recently, some of the developed tableau algorithms have been shown to be incorrect in the presence of general concept inclusion axioms (GCIs). In some fuzzy DLs, reasoning with GCIs has even turned out to be undecidable. This work provides a rigorous analysis of the boundary between decidable and undecidable reasoning problems in t-norm-based fuzzy DLs, in particular for GCIs. Existing undecidability proofs are extended to cover large classes of fuzzy DLs, and decidability is shown for most of the remaining logics considered here. Additionally, the computational complexity of reasoning in fuzzy DLs with semantics based on finite lattices is analyzed. For most decidability results, tight complexity bounds can be derived.

 2014-05-22 Talk
Technische Universität Dresden, INF/3027, 15:00

Title: Beyond DL-Lite: Big Data meets Heavier Ontologies

Abstract:
The attractive features of DL-Lite are well known: data can be left in legacy/scalable data stores, and data complexity is low (AC0). However, these benefits come at the cost of a severely constrained ontology language. Many applications seem to require more expressive ontologies; certainly, many existing ontologies do not satisfy the relevant constraints. In this talk I will survey techniques for (empirically) scalable query answering in cases where more expressive ontologies are used, focussing on recent work on enhanced materialisation-based techniques.

 2014-05-12 Talk
Universität Leipzig, Seminar room P 901 in Paulinum, Augustusplatz, 14:15

Title: Quantitative computation tree logic model checking based on possibility measure

Professor Dr. Yongming Li from Shaanxi Normal University (China)

Abstract:
We study generalized possibilistic computation tree logic model checking, which extends the possibilistic computation tree logic model checking previously introduced by the authors. The system is modeled by a generalized possibilistic Kripke structure (GPKS, for short), and the property to be verified is specified by a generalized possibilistic computation tree logic (GPoCTL, for short) formula. Based on the generalized possibility measure, we discuss the method of generalized possibilistic computation tree logic model checking and present the corresponding algorithm and its complexity in detail. As a by-product, we compare the previously introduced PoCTL with GPoCTL. Finally, a thermostat example is given to illustrate the PoCTL model-checking method.

 2014-04-02 Talk
Technische Universität Dresden, INF/3027, 14:00

Title: Automated Reasoning: why computational complexity isn't all that matters

Abstract:
Analysing the computational complexity of reasoning problems and designing (analysing, implementing, optimising, ...) automated reasoning procedures are well-established methodologies of research into logics. In this talk, I will try to explain related questions that matter when trying to use logics in the applications they have been designed for, and whose investigation is rather less well established. In particular, I will give a brief introduction to the computational complexity of logical problems, and then discuss what these results tell us about the behaviour of reasoning procedures. I will also talk about other aspects of the performance of reasoning procedures, including performance homogeneity/heterogeneity and cognitive complexity. The latter plays an important role in the explanation of reasoning results.

2013
 2013-07-27 PhD defense
Technische Universität Dresden, WIL/B321, 15:00

Title: Relational Structure Theory - A Localisation Theory for Algebraic Structures

Abstract:
This thesis extends a localisation theory for finite algebras to certain classes of infinite structures. Based on ideas and constructions originally stemming from Tame Congruence Theory, algebras are studied via local restrictions of their relational counterpart (Relational Structure Theory). In this respect, first those subsets are identified that are suitable for such a localisation process, i.e., that are compatible with the relational clone structure of the counterpart of an algebra. It is then studied which properties of the global algebra can be transferred to its localisations, called neighbourhoods. Thereafter, it is discussed how this process can be reversed, leading to the concept of covers. These are collections of neighbourhoods that allow information retrieval about the global structure from knowledge about the local restrictions. Subsequently, covers are characterised in terms of a decomposition equation, and connections to categorical equivalences of algebras are explored. In the second half of the thesis, a refinement concept for covers is introduced in order to find optimal, non-refinable covers, eventually leading to practical algorithms for their determination. Finally, the text establishes further theoretical foundations, e.g., several irreducibility notions, in order to ensure existence of non-refinable covers via an intrinsic characterisation, and to prove under some conditions that they are uniquely determined in a canonical sense. Lastly, the applicability of the developed techniques is demonstrated using two clear expository examples.

 2013-05-21 Seminar
Technische Universität Dresden, Willersbau, 15:00
 Title: The Hypergraph Transversal Problem: Applications and Complexity

Abstract:
Generating minimal transversals of a hypergraph, or, equivalently, dualizing a monotone DNF, is an important problem with many applications in computer science. In this talk I will address this problem and its decisional variant, the recognition of the transversal hypergraph of another hypergraph. I will survey some results on problems that are known to be related to computing the transversal hypergraph, focusing on problems in propositional logic, formal concept analysis, databases, data mining, and artificial intelligence. I will then address the computational complexity of recognizing the transversal hypergraph. While the tractability and exact complexity of this problem have been open for over 30 years, it is known that the decision problem can be solved in quasi-polynomial time, and in polynomial time with limited nondeterminism. Regarding space complexity, it was recently shown that the problem is in quadratic logspace. I will also discuss large classes of instances on which the problem is known to be tractable.
Speaker Bio:
Georg Gottlob is a Professor of Informatics at Oxford University, a Fellow of St John's College, Oxford, and an Adjunct Professor at TU Wien. His interests include data extraction, database theory, graph decomposition techniques, AI, knowledge representation, logic and complexity. Gottlob has received the Wittgenstein Award from the Austrian National Science Fund, is an ACM Fellow, an ECCAI Fellow, a Fellow of the Royal Society, and a member of the Austrian Academy of Sciences, the German National Academy of Sciences, and the Academia Europaea. He chaired the Program Committees of IJCAI 2003 and ACM PODS 2000. He is the main founder of Lixto, a company that provides tools and services for web data extraction. Gottlob was recently awarded an ERC Advanced Investigator's Grant for the project "DIADEM: Domain-centric Intelligent Automated Data Extraction Methodology". More information on Georg Gottlob can be found on his Web page.
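To pin down the object generated in the talk above: a transversal of a hypergraph is a vertex set meeting every edge, and the transversal hypergraph Tr(H) consists of the inclusion-minimal ones. The following is a brute-force sketch for tiny inputs only (names are mine; the algorithms surveyed in the talk, such as the quasi-polynomial Fredman-Khachiyan method, are far more sophisticated):

```python
from itertools import combinations

def minimal_transversals(hypergraph):
    """Enumerate all minimal transversals (minimal hitting sets) of a
    hypergraph given as a list of edges (sets of vertices). Brute force:
    try vertex subsets by increasing size; keep a candidate if it meets
    every edge and contains no smaller transversal already found."""
    vertices = sorted(set().union(*hypergraph))
    transversals = []
    for size in range(len(vertices) + 1):
        for cand in combinations(vertices, size):
            s = set(cand)
            if all(s & edge for edge in hypergraph) and \
               not any(t <= s for t in transversals):
                transversals.append(s)
    return transversals

# Tr({ab, bc}) = { {b}, {a,c} }: either take the shared vertex b,
# or hit the two edges separately with a and c.
print(minimal_transversals([{'a', 'b'}, {'b', 'c'}]))
```

Enumerating by increasing size makes the minimality check a simple subset test against the transversals found so far; the cost is exponential in the number of vertices, which is exactly why the complexity of this problem is interesting.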
 2013-03-05 Seminar
Technische Universität Dresden, INF/3027, 13:00
 Title: Temporalised Description Logics for Monitoring Partially Observable Events

Abstract:
Daily life depends on complex hardware and software systems. In runtime monitoring, one (incompletely) observes the behaviour of the actual system in a structured way. This information is used to verify that the system has certain properties, which are formulated in a temporal logic, or to answer temporal queries to determine whether a certain situation has arisen. One approach to runtime monitoring uses temporalised Description Logics to formalise temporal properties and so-called ABoxes to capture incomplete observations of the system's behaviour.
After a short introduction to runtime monitoring in general, we give an overview of the research field of temporalised Description Logics, and present some undecidability and complexity results for the satisfiability problem in such logics. Moreover, we give a short introduction to query answering w.r.t. Description Logic knowledge bases. The talk concludes with some goals of the thesis and an outlook on what has been achieved so far.
 2013-01-10 Seminar
Technische Universität Dresden, INF/3027, 10:00

Title: Reasoning with Vagueness in Description Logics using Automata and Tableaux

Abstract:
Description logics (DLs) are used to represent knowledge of an application domain and provide standard reasoning services to infer consequences of this knowledge. However, classical DLs are not suited to represent uncertainty about the knowledge or vagueness in the description of the knowledge.
In this talk, we will consider extensions of DLs that were developed to address these tasks. The focus lies on fuzzy DLs, which can be used to reason about vague concepts. We will follow the history of fuzzy DLs starting with the first reasoning procedures for simple fuzzy DLs.
More recently, more expressive fuzzy DLs based on mathematical fuzzy logic have been proposed. Some of the basic reasoning algorithms still work in this more general setting, but so-called general concept inclusion axioms can lead to problems and even cause undecidability of reasoning in some cases.
In the last part of the talk, the goal and methods of the thesis will be briefly described.

2012
 2012-11-09 Seminar
Technische Universität Dresden, INF/3027
 9:00, Modularity in Ontologies
Dirk Walther from Universidad Politécnica de Madrid (UPM)
Abstract:
Nowadays, logical theories in the guise of ontologies are designed for applications in bioinformatics, medicine, geography, linguistics and other areas. They are often based on description logics with well-understood and implemented reasoning problems and procedures. The challenges now are to provide automatic support for sharing and reuse of ontologies as well as for their design and maintenance. A key concept in tackling these challenges is the notion of modularity. Recently we have seen the introduction of many related notions together with challenging technical results. This talk focusses on model-theoretic notions of modularity and on reasoning problems that are relevant for modularity. Algorithms for module extraction for the description logics EL and ALC will be discussed, as well as their complexity and an experimental evaluation.
 10:30, The Logical Difference for the Lightweight Description Logic EL
Michel Ludwig from University of Liverpool
Abstract:
We present results obtained from studying a logic-based approach to versioning of ontologies. Under this view, ontologies provide answers to queries about some vocabulary of interest. The difference between two versions of an ontology is given by the set of queries that receive different answers. We have investigated this approach for terminologies given in the description logic EL extended with role inclusions and domain/range restrictions, for three distinct types of queries: subsumption, instance, and conjunctive queries. In all three cases, we present polynomial-time algorithms that decide whether two terminologies give the same answers to queries over a given vocabulary and compute a succinct representation of the difference if it is nonempty. We also present the tool CEX2.5, which is an implementation of the developed algorithms, and we analyze its performance in the experiments we have conducted with distinct versions of the SNOMED CT and NCI ontologies.