Videos and presentation materials from other INI events are also available.
Event  When  Speaker  Title  Presentation Material 

LAAW01 
9th January 2006 09:00 to 10:00 
Model theoretic methods for special classes of (finite) structures  
LAAW01 
9th January 2006 10:00 to 11:00 
B Kuijpers  Topological queries: topological invariants and query languages  
LAAW01 
9th January 2006 11:00 to 12:00 
Logical aspects of spatial databases  
LAAW01 
9th January 2006 13:00 to 14:00 
Asymptotics of definable sets in finite structures  
LAAW01 
9th January 2006 14:00 to 15:00 
T Wilke  Logics, automata, and finite semigroups  
LAAW01 
9th January 2006 15:00 to 16:00 
Towards understanding tree languages, part 1  
LAAW01 
9th January 2006 16:00 to 17:00 
M Djordjevic  Connections between finite and infinite model theory  
LAAW01 
9th January 2006 17:00 to 18:00 
Analysis of recursive Markov chains, recursive Markov decision processes, and recursive stochastic games  
LAAW01 
10th January 2006 13:00 to 14:00 
R Elwes  Asymptotics of definable sets in finite structures  
LAAW01 
10th January 2006 15:00 to 16:00 
Towards understanding tree languages, part 2  
LAAW01 
11th January 2006 13:00 to 14:00 
Asymptotics of definable sets in finite structures  
LAAW01 
12th January 2006 11:00 to 12:00 
Classifying automatic structures  
LAAW01 
12th January 2006 12:00 to 13:00 
(Finite) model theory of tree and tree-like structures  
LAAW01 
12th January 2006 13:00 to 14:00 
(Finite) model theory of tree and tree-like structures  
LAAW01 
13th January 2006 17:00 to 18:00 
The algebraic approach to infinite-valued constraint satisfaction  
LAA 
19th January 2006 11:00 to 12:00 
On the selectivity of a semantic subsumption index Subsumption is a way of saying that one first-order clause is ''more general'' than another. Millions of pairs of clauses need to be tested for subsumption in a single run of a typical first-order resolution-based theorem prover. Deciding, for two given clauses, whether one of them subsumes the other is NP-hard, and thus subsumption tests take a lot of time, usually more than half of the running time of a prover. Complicated data structures have been developed to support subsumption tests, including discrimination trees and their most advanced version, called code trees, implemented in Vampire. Since subsumption is a syntactic notion, it is the syntax of the clauses which is indexed in these trees. We show that in some cases a simple semantic indexing technique can prove more efficient in this context than the known complicated syntactic data structures. Our main experimental results are: (i) in most (up to 99%) of "practical" negative cases, the non-subsumption can be shown by a counterexample structure consisting of 4 elements; (ii) if a logic with sufficiently many truth values is used, then the counterexamples are relatively frequent, so they can be constructed by a random guess. Finally, we provide some evidence that a similar semantic idea can be successfully used for testing AC matching, which is another NP-complete problem whose instances are solved by the millions in theorem provers. 
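The semantic idea above can be illustrated with a toy sketch (not the prover's actual indexing structures; the clause encoding and helper names here are invented for illustration): since subsumption implies entailment, a small structure in which clause C holds but clause D fails witnesses that C cannot subsume D.

```python
import itertools
import random

# A clause is a list of literals; a literal is (sign, predicate, args),
# where args is a tuple of variable names. This encoding is invented
# for illustration and is far simpler than a real prover's.

def clause_true(clause, interp, domain):
    # A clause is read as the universal closure of a disjunction of
    # literals: it must hold under every assignment to its variables.
    vars_ = sorted({v for (_, _, args) in clause for v in args})
    for vals in itertools.product(domain, repeat=len(vars_)):
        env = dict(zip(vars_, vals))
        if not any(interp[p][tuple(env[a] for a in args)] == sign
                   for (sign, p, args) in clause):
            return False
    return True

def refutes_subsumption(c, d, interp, domain):
    # Subsumption implies entailment, so a structure satisfying C
    # but falsifying D shows that C does not subsume D.
    return clause_true(c, interp, domain) and not clause_true(d, interp, domain)

def random_interp(arities, domain):
    # Guess a random structure over a small domain.
    return {p: {t: random.choice([True, False])
                for t in itertools.product(domain, repeat=ar)}
            for p, ar in arities.items()}
```

With a 4-element domain and random guesses, non-subsumption of, say, p(x) by q(x) is typically witnessed after a handful of tries, matching the flavour of result (i) above.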

LAA 
24th January 2006 11:00 to 12:00 
Epistemic logics for time and space bounded reasoning In standard epistemic logics, an agent's beliefs are modelled as closed under logical consequence. This becomes a problem if we are interested in the evolution of the beliefs of a rational agent which requires time and memory to derive consequences of its beliefs. For example, we may be interested in expressing and verifying properties like 'given its current beliefs and its memory size, the agent will/will not believe A in the next n time steps', where A may be a consequence of the agent's beliefs. The basic idea of our approach is as follows. The reasoning agent is modelled as a state transition system; each state is a finite set of formulas (the agent's beliefs), whose maximal size is determined by the agent's memory size. Transitions between states correspond to applications of the agent's inference rules. The language of temporal epistemic logic can be interpreted in such structures in a straightforward way. Several interesting directions for research arise next, for example:
- completeness and decidability of epistemic logics for particular kinds of reasoners (determined by their inference rules);
- model checking: using MBP (a model-based planner, developed in Trento) to verify memory requirements for a classical reasoner and a rule-based reasoner;
- expressive power: adding epistemic operators which allow us to express properties like 'the agent only knows m formulas'. 
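A minimal sketch of the state transition view described above (the single modus ponens rule, the string-based formula encoding and all function names are assumptions made for illustration): states are belief sets bounded by memory size, and transitions apply an inference rule, branching over which old belief to overwrite when memory is full.

```python
def reachable_beliefs(initial, max_size, steps):
    # States are frozensets of believed formulas. An implication is
    # encoded as a tuple (premise, '->', conclusion); the one inference
    # rule here is modus ponens. When memory is full, adding a new
    # belief branches over which old belief gets overwritten.
    frontier = {frozenset(initial)}
    seen = set(frontier)
    for _ in range(steps):
        nxt = set()
        for state in frontier:
            for f in state:
                if isinstance(f, tuple) and f[1] == '->' and f[0] in state:
                    concl = f[2]
                    if concl in state:
                        continue
                    if len(state) < max_size:
                        nxt.add(state | {concl})
                    else:
                        for drop in state:
                            nxt.add((state - {drop}) | {concl})
        frontier = nxt - seen
        seen |= frontier
        if not frontier:
            break
    return seen

def will_believe(initial, formula, max_size, steps):
    # 'The agent can come to believe `formula` within `steps` time steps.'
    return any(formula in s for s in reachable_beliefs(initial, max_size, steps))
```

Properties like 'the agent will/will not believe A in the next n time steps' then become reachability queries over this transition system.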

LAA 
26th January 2006 11:00 to 12:00 
Complexity of constraint problems via polymorphisms  
LAA 
31st January 2006 11:00 to 12:00 
M Valeriote 
Tractable constraint languages arising from some algebras that generate congruence distributive varieties A consequence of conjectures of Larose-Zádori and of Bulatov is that any constraint language arising from a finite algebra that generates a congruence distributive variety is of bounded relational width and hence is tractable. Some special cases of this have been known for some time, for example, when the algebra has a majority or, more generally, a near unanimity term operation. A classic theorem of universal algebra (due to Jónsson) is that an algebra generates a congruence distributive variety if and only if it has a sequence of Jónsson terms. In my talk I will discuss the following theorem, obtained jointly with Emil Kiss: Theorem: Let A be a finite algebra that has a sequence of four Jónsson terms. Then the infinite constraint language consisting of all subuniverses of finite powers of A has bounded relational width and hence is globally tractable. 

LAA 
2nd February 2006 11:00 to 12:00 
Game semantics and its algorithmic applications: Part I  
LAA 
3rd February 2006 11:00 to 12:00 
Automatically proving the termination of C programs In this talk I will discuss Terminator, a program termination prover that supports large programs with arbitrarily nested loops or recursive functions, and imperative features such as references, functions with side-effects, and function pointers. Terminator is based on a newly discovered method of counterexample-guided abstraction refinement for program termination proofs. The termination argument is constructed incrementally, based on failed proof attempts. Terminator also infers and proves required program invariants on demand. The lecture will close with information about the results of some experiments with Terminator on Windows device drivers. 

LAA 
6th February 2006 11:00 to 12:00 
A Atserias 
Non-uniform hardness for NP via black-box adversaries We may believe SAT does not have small Boolean circuits. But is it possible that some language with small circuits looks indistinguishable from SAT to every polynomial-time bounded adversary? We rule out this possibility. More precisely, assuming SAT does not have small circuits, we show that for every language $A$ with small circuits, there exists a probabilistic polynomial-time algorithm that makes black-box queries to $A$ and produces, for a given input length, a Boolean formula on which $A$ differs from SAT. This will be a blackboard lecture (no slides). 

LAA 
7th February 2006 11:00 to 12:00 
Game semantics and its algorithmic applications: Part II  
LAA 
9th February 2006 11:00 to 12:00 
Game semantics and its algorithmic applications: Part III  
LAA 
9th February 2006 16:00 to 17:00 
A Atserias  Part II on hardness for NP (informal discussion)  
LAA 
13th February 2006 11:00 to 12:00 
Avoiding determinization Nondeterministic automata may have several runs on every input, and an input is accepted if one of the runs is accepting. In contrast, deterministic automata have a single run on every input. The existential mode of nondeterministic automata is an obstacle in many applications. For example, complementing a nondeterministic automaton A cannot be done by dualizing its acceptance condition. Likewise, running A on all branches of a tree, which is needed in decision procedures for higher-order logics, game solving, and open-system synthesis, results in a tree automaton that accepts a subset of the trees all of whose branches are accepted by A. The standard solution for coping with this obstacle is to determinize A. This gives rise to yet another obstacle: determinization of automata on infinite words is very complicated. Safra's optimal determinization algorithm is highly non-trivial, and indeed this approach has proved not too amenable to implementation. We offer an alternative to the standard automata-theoretic approach. The crux of our approach is avoiding determinization; it goes instead via universal automata. Like nondeterministic automata, universal automata may have several runs on every input. Here, however, an input is accepted if all of the runs are accepting. We show how the use of universal automata significantly simplifies known complementation constructions for automata on infinite words, known decision procedures for higher-order logics, known synthesis algorithms, and other applications that are now based on determinization. Our algorithms are less difficult to implement and have practical advantages such as being amenable to optimizations and a symbolic implementation. 
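On finite words the existential/universal duality is easy to see directly (a toy sketch; the talk's contribution concerns the much harder infinite-word and tree settings, which this does not capture). Reading a nondeterministic automaton's transitions in universal mode with complemented final states yields exactly the complement language; the example automaton below is our own.

```python
def run_states(delta, start, word):
    # Subset construction: all states reachable on the given word.
    states = {start}
    for letter in word:
        states = {q for s in states for q in delta.get((s, letter), ())}
    return states

def nfa_accepts(delta, start, finals, word):
    # Existential mode: some run ends in a final state.
    return bool(run_states(delta, start, word) & finals)

def ufa_accepts(delta, start, finals, word):
    # Universal mode: every run ends in a final state.
    return run_states(delta, start, word) <= finals

# Automaton over {a, b} whose nondeterministic language is
# "contains the factor ab".
delta = {(0, 'a'): {0, 1}, (0, 'b'): {0},
         (1, 'b'): {2},
         (2, 'a'): {2}, (2, 'b'): {2}}
```

Here `ufa_accepts(delta, 0, {0, 1}, w)` holds exactly when `nfa_accepts(delta, 0, {2}, w)` fails: complementation by dualizing both the branching mode and the acceptance condition, with no determinization step.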

LAA 
14th February 2006 11:00 to 12:00 
Game semantics and its algorithmic applications: Part IV  
LAA 
15th February 2006 11:30 to 12:30 
Complete problems for higher order logics Let $i, j \geq 1$, and let $\Sigma^i_j$ denote the class of higher order logic formulas of order $i+1$ with $j-1$ alternations of quantifier blocks of variables of order $i+1$, starting with an existential quantifier block. There is a precise correspondence between the nondeterministic exponential time hierarchy and the different fragments $\Sigma^i_j$ of higher order logics, namely $NEXP^j_i = \Sigma^{i+1}_j$. In this article we present a complete problem for each level of the nondeterministic exponential time hierarchy, under a very weak sort of reductions, namely quantifier-free first-order reductions. Moreover, we do not assume the existence of an order on the input structures in this reduction. From the logical point of view, our main result says that every fragment $\Sigma^i_j$ of higher order logics can be captured with a first-order logic Lindström quantifier. Moreover, as our reductions are quantifier-free first-order formulas, we get a normal form stating that each $\Sigma^i_j$ sentence is equivalent to a single occurrence of the quantifier applied to a tuple of quantifier-free first-order formulas. Our complete problems are a generalization of the well-known problem of quantified Boolean formulas with bounded alternation. 

LAA 
16th February 2006 11:00 to 12:00 
Game semantics and its algorithmic applications: Part V  
LAA 
17th February 2006 11:00 to 12:00 
A class of SAT instances which are hard for resolution-based SAT solvers In this talk we will construct a class of SAT formulae based on Eulerian graphs. The satisfiability of these formulae depends on the number of edges in the Eulerian graph, and it is easy to construct extremely similar formulae which differ in satisfiability. The structure of the graph can also be used to tune the formulae to be resistant to several modern enhancements of the basic DPLL algorithm. Some possible connections to the hardness of random k-SAT formulae will also be mentioned. 
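A standard family with this flavour are Tseitin parity formulas over a graph, whose satisfiability flips with a single vertex charge (and hence with the edge and vertex counts). The sketch below is our own illustration of that family, not necessarily the exact construction of the talk.

```python
from itertools import product

def tseitin_clauses(edges, charge):
    # One Boolean variable per edge. For each vertex v, the XOR of the
    # variables on its incident edges must equal charge[v]; each XOR is
    # encoded in CNF by forbidding every assignment of wrong parity.
    clauses = []
    for v, c in charge.items():
        incident = [i for i, e in enumerate(edges) if v in e]
        for bits in product([0, 1], repeat=len(incident)):
            if sum(bits) % 2 != c:
                # Clause ruling out this wrong-parity assignment:
                # a literal (var, wanted_value) per incident edge.
                clauses.append([(i, 1 - b) for i, b in zip(incident, bits)])
    return clauses

def satisfiable(n_vars, clauses):
    # Brute-force check, fine for tiny instances only.
    return any(all(any(assign[i] == want for i, want in cl) for cl in clauses)
               for assign in product([0, 1], repeat=n_vars))
```

On a triangle, charges summing to an even number give a satisfiable formula, while an odd sum gives an unsatisfiable one: two extremely similar instances differing in satisfiability, as described above.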

LAA 
20th February 2006 11:00 to 12:00 
The computational complexity of quantified constraint satisfaction The constraint satisfaction problem (CSP) is a well-acknowledged framework for modelling and solving computational search problems. A more expressive framework that generalizes the CSP is the quantified constraint satisfaction problem (QCSP). Problems from areas as diverse as graph coloring, game playing, database theory, and non-monotonic reasoning can be naturally formulated in these frameworks. Both of these problems are, in their general formulation, computationally intractable; the CSP is NP-complete, whereas the QCSP is PSPACE-complete. This intractability has motivated the pursuit of restricted cases of these problems that are tractable. One way to restrict these problems is by pre-specifying the types of constraints that may appear. This line of work has its origins in a 1978 paper of Schaefer, who achieved a complexity classification of constraints over a two-element domain, but left open the question of such a classification over domains of larger size. Over the past decade, there has been dramatic progress on this question, largely due to a link that has been identified between universal algebra and CSP complexity. This talk demonstrates how, with this link, algebraic techniques can be used to study QCSP complexity. We develop a new methodology for proving QCSP tractability results called collapsibility, which allows us to
- unify and identify common structure among the initial QCSP tractability results, which were proved using disparate proof techniques (in different decades!), and
- derive new QCSP tractability results.
Time permitting, we will also discuss recent progress on the case of the 3-element QCSP. 

LAA 
21st February 2006 11:00 to 12:00 
Biological systems as reactive systems Systems Biology is a new discipline aiming to understand the behavior of biological systems as it results from the (non-trivial, "emergent") interaction of biological components. We discuss some biological networks that are characterized by simple components but complex interactions. The components are separately described in stochastic pi-calculus, a "programming language" that should scale up to the description of large systems. The components are then wired together, and their interactions are studied by stochastic simulation. Subtle and unexpected behavior emerges even from simple circuits, and yet stable behavior emerges too, giving some hints about what may be critical and what may be irrelevant in the organization of biological networks. 
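Stochastic simulation of such component networks is commonly done with Gillespie's algorithm; below is a minimal sketch (the reaction encoding and function names are our own, not the stochastic pi-calculus machinery the talk uses).

```python
import random

def gillespie(rates, reactions, state, t_end, seed=0):
    # reactions: list of (reactants, products) dicts mapping species
    # names to multiplicities; rates: one rate constant per reaction.
    rng = random.Random(seed)
    t, trace = 0.0, [(0.0, dict(state))]
    while t < t_end:
        # Propensity of each reaction in the current state.
        props = []
        for k, (reac, _) in zip(rates, reactions):
            a = k
            for sp, n in reac.items():
                for i in range(n):
                    a *= max(state.get(sp, 0) - i, 0)
            props.append(a)
        total = sum(props)
        if total == 0:
            break  # nothing can fire any more
        t += rng.expovariate(total)   # exponential time to next event
        pick = rng.uniform(0, total)  # choose which reaction fires
        for a, (reac, prod) in zip(props, reactions):
            if pick < a:
                for sp, n in reac.items():
                    state[sp] = state[sp] - n
                for sp, n in prod.items():
                    state[sp] = state.get(sp, 0) + n
                break
            pick -= a
        trace.append((t, dict(state)))
    return trace
```

For example, pure degradation A -> (nothing) at rate 1 yields a monotonically decreasing molecule count; wiring several such reactions together is where the emergent behavior described above appears.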

LAA 
22nd February 2006 11:00 to 12:00 
Syntactic vs. semantic approximations to logics that capture complexity classes I will formulate a formal syntax of approximate formulas for the logic with counting quantifiers SOLP, which is essentially a first-order language extended with quantifiers that act upon second-order variables of a given arity r, and count the fraction of elements in a subset of r-tuples of a model that satisfy a formula. Some facts about SOLP: (i) in the presence of a built-in (linear) order, it can describe NP-complete problems, and fragments of it capture classes like P and NL; (ii) weakening the ordering relation to an almost order, we can separate meaningful fragments, using a combinatorial tool suited for these languages. The purpose of the approximate formulas is to provide a syntactic approximation to logics contained in SOLP with built-in order, complementary to the semantic approximation based on almost orders, by producing approximating logics where problems are described within a small counting error. I will introduce a concept of strong expressibility based on approximate formulas, and show that for many fragments of SOLP with built-in order, including ones that capture P and NL, expressibility and strong expressibility are equivalent. I will state a Bridge Theorem that links expressibility in fragments of SOLP over almost-ordered structures to strong expressibility with respect to approximate formulas for the corresponding fragments over ordered structures. A consequence of these results is that proving inexpressibility results for fragments of SOLP with built-in order can be done by proving inexpressibility over the corresponding fragments with built-in almost order, where separation proofs are easier. 

LAA 
23rd February 2006 11:00 to 12:00 
Beyond Hypertree Width: Decomposition methods without decompositions The general intractability of the constraint satisfaction problem has motivated the study of restrictions on this problem that permit polynomial-time solvability. One major line of work has focused on structural restrictions, which arise from restricting the interaction among constraint scopes. In this talk we consider generalized hypertree width, a structural measure that has until recently eluded study. We give a simple proof of the tractability of instances having bounded generalized hypertree width. 

LAA 
24th February 2006 11:00 to 12:00 
Complete problems for higher order logics Let $i, j \geq 1$, and let $\Sigma^i_j$ denote the class of higher order logic formulas of order $i+1$ with $j-1$ alternations of quantifier blocks of variables of order $i+1$, starting with an existential quantifier block. There is a precise correspondence between the nondeterministic exponential time hierarchy and the different fragments $\Sigma^i_j$ of higher order logics, namely $NEXP^j_i = \Sigma^{i+1}_j$. In this article we present a complete problem for each level of the nondeterministic exponential time hierarchy, under a very weak sort of reductions, namely quantifier-free first-order reductions. Moreover, we do not assume the existence of an order on the input structures in this reduction. From the logical point of view, our main result says that every fragment $\Sigma^i_j$ of higher order logics can be captured with a first-order logic Lindström quantifier. Moreover, as our reductions are quantifier-free first-order formulas, we get a normal form stating that each $\Sigma^i_j$ sentence is equivalent to a single occurrence of the quantifier applied to a tuple of quantifier-free first-order formulas. Our complete problems are a generalization of the well-known problem of quantified Boolean formulas with bounded alternation. 

LAAW02 
27th February 2006 09:30 to 11:00 
The automaton approach to XML schema languages: from practice to theory  
LAAW02 
27th February 2006 11:30 to 12:30 
Probabilities in databases and in logics I This talk discusses two applications of probabilities in databases, and their corresponding problems in probabilistic logic. The first application is managing imprecise data: here each tuple in the database becomes a probabilistic event, and query evaluation reduces to the problem of computing the probability that a sentence is true. This is the model that has been used in "probabilistic databases" over the last twenty years. I will discuss several aspects of conjunctive query evaluation, including the data complexity (#P-complete) and provably efficient approximation algorithms for top-k queries. The second application is managing incompleteness: here the database is not known, but instead is learned through a set of observations, such as views or statistics. Examples include query answering using views, measuring information leakage, and cardinality estimation from data statistics. Under the Bayesian approach to incompleteness, a prior probability distribution is assumed on all possible databases and the set of observations is used to condition the prior. Query evaluation reduces to the problem of computing a conditional probability: in fact, for practical applications we need to compute the limit of this conditional probability as the domain size grows to infinity. I will describe an explicit method for computing this limit conditional probability for conjunctive queries, then present the complexity of various decision problems associated with it. This is joint work with Nilesh Dalvi. 
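The first model above (each tuple an independent probabilistic event) can be made concrete with a brute-force evaluator; the names here are invented, and real systems avoid this exponential enumeration over possible worlds, which is exactly why the #P-completeness matters.

```python
from itertools import product

def query_probability(facts, probs, query):
    # facts: list of tuples; probs[i] is the independent probability
    # that facts[i] is present. query maps a set of present facts to
    # True/False. Sums over all 2^n possible worlds: exponential, so
    # this is only usable on tiny examples.
    total = 0.0
    for world in product([0, 1], repeat=len(facts)):
        p = 1.0
        for bit, pr in zip(world, probs):
            p *= pr if bit else 1 - pr
        present = {f for f, bit in zip(facts, world) if bit}
        if query(present):
            total += p
    return total
```

For instance, with two independent tuples each present with probability 0.5, the probability that the relation is nonempty is 1 - 0.25 = 0.75.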

LAAW02 
27th February 2006 14:00 to 15:00 
Foundations of Schema mappings I  
LAAW02 
27th February 2006 15:30 to 16:15 
V Sazonov 
Querying hyperset/Web-like databases A hyperset approach to Web-like (semistructured) databases and the query language Delta are presented in a simple, intuitive and quite precise way, hopefully for a wider audience, demonstrating how this abstract view can be represented in concrete terms of set equations allowing cycles. This approach answers the question of how the expressive power of querying semistructured databases can be characterised, both in terms of descriptive complexity and by illustrative examples, such as defining in terms of Delta the formal semantics of path expressions, as well as of the known languages UnQL and UnCAL. 
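Equality of hypersets given by systems of set equations with cycles reduces to bisimulation on the membership graph; here is a naive greatest-fixpoint sketch (our own illustration, not the Delta language itself).

```python
def bisimilar(children, x, y):
    # children: node -> set of member nodes (cycles allowed), i.e. the
    # graph of a system of set equations. Two nodes denote the same
    # hyperset iff they are bisimilar.
    nodes = set(children)
    rel = {(a, b) for a in nodes for b in nodes}
    changed = True
    while changed:
        changed = False
        for (a, b) in list(rel):
            # Every member of a must match some member of b, and vice versa.
            ok = (all(any((c, d) in rel for d in children[b]) for c in children[a])
                  and all(any((c, d) in rel for c in children[a]) for d in children[b]))
            if not ok:
                rel.discard((a, b))
                changed = True
    return (x, y) in rel
```

For example, any two solutions of the cyclic equation Omega = {Omega} denote the same hyperset, while the empty set differs from {{}}.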

LAAW02 
27th February 2006 16:15 to 17:00 
Concentration bounds for Markov processes of metafinite models A formal model of discrete dynamical systems with probabilistic updates is presented. States of a system are metafinite models, and update rules are essentially Abstract State Machine rules augmented with a random function. Given an initial state, a sequence of updates generates a trajectory in the phase space of the system. It is shown that if the states in the trajectory satisfy certain locality conditions, then logically definable random variables of the states are concentrated around their expectation. That is, with high probability, their values are close to their average value. 

LAAW02 
27th February 2006 17:00 to 17:45 
XPath with transitive closure We determine the expressivity of the expansion of Core XPath with a reflexive transitive closure operator. This expansion is known under the name "Regular XPath". It allows us to write expressions such as (child::*/child::*)+ meaning "give me all nodes which are an even number of steps below me", and filter expressions like "I have an even number of descendants". The transitive closure operator is NOT part of the XPath 1.0 or 2.0 standards, but is available in the XSLT processors SAXON and XALAN. Regular XPath can be seen as a variant of Propositional Dynamic Logic, with four programs (child, parent, left, right) and propositional variables corresponding to XML tag names. Core XPath corresponds to a fragment of PDL in which the Kleene star may only be applied to atomic programs. Interestingly, in order to obtain a nice characterization of Regular XPath in terms of first-order logic expanded with transitive closure, we need to add something more to Core XPath. This extra addition can be phrased in a number of different ways. One is the following: if P and Q are path expressions, then P=Q is a filter expression. P=Q is true at a node n iff there exists a node m which is reachable from n by both P and Q. At present we do not know if this extra addition is really needed. 
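The even-depth example above can be evaluated by iterating the two-step child relation to a fixpoint; this is a toy evaluator whose tree encoding is our own, not a real XPath engine.

```python
def child_step(tree, nodes):
    # tree: node -> list of children (an XML document tree skeleton).
    return {c for n in nodes for c in tree.get(n, [])}

def two_step_closure(tree, node):
    # Evaluates (child::*/child::*)+ from `node`: all nodes a positive,
    # even number of child steps below it.
    result, frontier = set(), {node}
    while True:
        frontier = child_step(tree, child_step(tree, frontier))
        if frontier <= result:
            return result
        result |= frontier
```

On a chain r -> a -> b -> c -> d this selects exactly b and d, the nodes at even depth below r.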

LAAW02 
28th February 2006 09:30 to 10:15 
Cardinality-based semantics for consistent query answering: incremental and parameterized complexity Consistent Query Answering (CQA) is the problem of computing from a database the answers to a query that are consistent with respect to certain integrity constraints that the database, as a whole, may fail to satisfy. Consistent answers have been characterized as those that are invariant under certain minimal forms of restoration of the consistency of the database. In this paper we investigate algorithmic and complexity-theoretic issues of CQA under database repairs that minimally depart from the original database with respect to the cardinality of the symmetric difference. Research on this kind of repairs had been suggested in the literature, but no systematic study had been done. Here we obtain the first tight complexity bounds. We also address, considering for the first time a dynamic scenario for CQA, the problem of incremental complexity of CQA, which naturally occurs when an originally consistent database becomes inconsistent after the execution of a sequence of update operations. Tight bounds on incremental complexity are provided for various semantics under denial constraints, e.g. (a) minimum tuple-based repairs with respect to cardinality, (b) minimal tuple-based repairs with respect to set inclusion, and (c) minimum numerical aggregation of attribute-based repairs. Fixed-parameter tractability is also investigated in this dynamic context, where the size of the update sequence becomes the relevant parameter. 

LAAW02 
28th February 2006 10:15 to 11:00 
Computational problems of data exchange Data exchange is the problem of inserting data structured under a source schema into a target schema of different structure (possibly with integrity constraints), while reflecting the source data as accurately as possible. We study computational issues related to data exchange in the setting of Fagin, Kolaitis, and Popa (PODS'03). In particular, we are interested in computing a particular universal solution to a data exchange problem, called "the core". We present polynomial algorithms for computing the core of large classes of data exchange problems, solving relevant open problems of Fagin et al. Our results show that data exchange with cores is feasible in a very general framework. Furthermore, we use the technique of hypertree decompositions to derive improved algorithms for computing the core of a relational instance with labeled nulls, a problem we show to be fixed-parameter intractable with respect to the block size of the input instances. Finally, we show that computing cores is NP-hard in the presence of a system predicate NULL(x), which is true if x is a null value. Part of this work is joint work with Alan Nash, UCSD. 

LAAW02 
28th February 2006 11:30 to 12:30 
Probabilities in databases and in logics II  
LAAW02 
28th February 2006 14:00 to 15:00 
Foundations of Schema mappings II  
LAAW02 
28th February 2006 15:30 to 16:15 
Context logic and tree update Spatial logics have been used to describe properties of tree-like structures (Ambient Logic) and, in a Hoare style, to reason about dynamic updates of heap-like structures (Separation Logic). A natural question is whether the Ambient Logic can be used to reason about tree update (XML update). Calcagno, Zarfaty and I have shown that this is not possible. Instead, we had to fundamentally change the way we reason about structured data by introducing Context Logic. Local data update typically identifies the portion of data to be replaced, removes it, and inserts the new data in the same place. With Context Logic, we can reason about both data and this place of insertion (contexts). We have shown that Context Logic can be used to develop local Hoare reasoning about tree update, heap update (analogous to Separation Logic reasoning), and term rewriting (which escaped reasoning using Separation Logic). In this talk, I will motivate Context Logic using our local Hoare reasoning about tree update (XML update). However, my main goal is to explain some expressivity results. With our style of Hoare reasoning, certain so-called adjunct connectives play an essential role in describing the weakest preconditions (important for both the theory and tool development). These connectives can be eliminated in the tree case (without quantifiers). This result follows previous results on Separation Logic and Ambient Logic, by Lozes first and then Dawar, Gardner and Ghelli. Such results are interesting, since there is typically no simple translation from a formula to its adjunct-free equivalent and, in the case of the Ambient Logic with hiding, the task of finding the adjunct-free formulae is not computable. Despite these elimination results, our intuition has always been that the adjunct connectives are essential for Hoare reasoning. The point is that the weakest preconditions are parametric with respect to a formula variable. 
We have recently developed inexpressibility results, which characterise what it means for a logic to be unable to express such parametric properties. We have not seen this style of parametric inexpressibility result before, and believe it is an important general concept that will be applicable to other styles of logical reasoning. 

LAAW02 
28th February 2006 16:15 to 17:00 
M Katz 
Approximation logic and databases The Logic of Approximation is an infinite-valued first-order logic, designed to handle inexactness in scientific research. Truth-values of this logic, ranging over the closed interval [0,1], are 'degrees of error'. Thus 0 (no error) stands for absolute truth and 1 (maximal error) for absolute falsehood. The semantic rules for connectives and quantifiers are reversals of familiar rules of fuzzy logics. Unique to this logic is a notion of deduction based on the idea that errors in the conclusion can be minimized (to within arbitrary epsilon) if errors in the premiss are controlled (to within an appropriate delta). In the past this logic was developed and applied by the author to geometry, quantum theory, measurement theory and utility theory. In the present paper it is applied to the study of relational databases. In particular, we shall try to show how to handle, within the framework of the logic of approximation, various properties of specific databases derived from batteries of intelligence tests and from surveys of social attitudes. 

LAAW02 
28th February 2006 17:00 to 17:45 
A normal form for singulary logic over physically realizable data models We present a new normal form for first-order logic over bounded-degree structures which relies on the symmetry of labeled connections between elements of the structure. The key idea is to use vocabularies with only monadic predicate and monadic function symbols, so-called singulary logic [Church]. Our proof is effective and syntactic, not relying on games or other semantic characterizations. Linear-time computability of first-order queries becomes a direct corollary of this result. Included will be a discussion of why these data structures are the appropriate models for physically realizable databases. [Church, A.] Introduction to Mathematical Logic, Princeton University Press, 1956.


LAAW02 
1st March 2006 09:15 to 10:15 
Queries on tree-structured data: Logical languages and complexity In this talk, I will survey recent results on the theory, and in particular the complexity, of a number of popular logical query languages for tree-structured data and XML, including XPath, conjunctive and first-order queries, and XQuery. 

LAAW02 
1st March 2006 10:15 to 11:00 
BPQL: A query language for business processes This talk presents BPQL, a novel graphical query language for querying business processes, implemented as a set of cooperating web services. BPQL is based on an intuitive model of business processes, an abstraction of the emerging BPEL (Business Process Execution Language) standard. It allows users to query business processes visually, in a manner analogous to how such processes are typically specified, and can be employed in a distributed setting, where process components may be provided by distinct providers (peers). The talk describes the query language as well as its underlying formal model. Special emphasis is given to the following subjects: (a) the analogy between the specification and querying of business processes, and (b) the use of graph grammars to represent potentially infinite query results by a finite and concise representation. The current implementation, using Active XML and Web services, is briefly described. 

LAAW02 
1st March 2006 11:30 to 12:30 
Pebble tree-walking automata and TC-logics on trees  
LAAW02 
2nd March 2006 09:15 to 10:15 
The complexity of processing data streams and external memory data I  
LAAW02 
2nd March 2006 10:15 to 11:00 
Ontology mediated data management Ontologies provide a conceptualization of a domain of interest. Nowadays, they are typically represented in terms of Description Logics, and are seen as the key technology used to describe the semantics of information at various sites. The idea of using ontologies as a conceptual view over data repositories is becoming more and more popular. In order for this idea to become widespread in standard applications, it is fundamental that the conceptual layer through which the underlying data layer is accessed does not introduce a significant overhead in dealing with the data. Based on these observations, in recent years techniques to access data sources through a conceptual layer built on top of them have been developed, and the computational complexity of answering queries over ontologies, measured in the size of the underlying data, has been studied. The talk is divided into two parts. The first part will present the general ideas underlying ontology mediated data management, and will discuss the trade-off between the expressive power of the ontology language and efficiency in query processing. We will specifically address the possibility of delegating query processing over ontologies to database engines of increasing power (from relational, to linear recursion, to arbitrary recursion, to disjunctive Datalog). In the second part, we will provide a concrete example of a system for ontology mediated data management on top of a relational engine. The system is implemented in the prototype tool QuOnto. The talk will be jointly given by Diego Calvanese and Giuseppe De Giacomo. 

LAAW02 
2nd March 2006 11:30 to 12:30 
Automatic verification of communicating data-aware web services This talk presents recent work at UC San Diego on the verification of compositions of web service peers which interact asynchronously by exchanging messages. Each peer can be viewed as a data-aware reactive system. It has access to a local database and reacts to user input and incoming messages by performing various actions and sending messages. The reaction is described by queries over the database, internal state, user input and received messages. We consider two formalisms for specification of correctness properties of compositions, namely Linear Temporal First-Order Logic and Conversation Protocols. For both formalisms, we map the boundaries of verification decidability, showing that they include expressive classes of compositions and properties. We also address modular verification, in which the correctness of a composition is predicated on the properties of its environment. Finally, we present encouraging experimental results on verification of single peers (WAVE project). These suggest that automatic verification may be practically feasible for a large class of data-aware web services. 


LAAW02 
2nd March 2006 14:00 to 15:00 
Database normalization revisited: an information-theoretic approach  
LAAW02 
2nd March 2006 15:30 to 16:15 
S Szeider 
On the clique-width of graphs Clique-width is a graph parameter that measures in a certain sense the complexity of a graph. It is known that MSO_1 queries can be evaluated in polynomial time on classes of graphs with effectively bounded clique-width. Here MSO_1 denotes the fragment of Monadic Second Order Logic with second-order quantification on vertex sets. NP-hard problems like 3-colorability can be expressed as MSO_1 queries. This talk will survey the concept of clique-width, its advantages and its limits. In particular, new NP-hardness results for the recognition of graphs with bounded clique-width will be presented; these results were recently obtained in joint work with M.R. Fellows, F. Rosamond, and U. Rotics. 
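A graph has clique-width at most k if it can be built with k vertex labels using four operations: create a labelled vertex, take a disjoint union, add all edges between two labels, and relabel. The following toy Python model (the class and method names are invented for illustration, not from the talk) builds a triangle with two labels, reflecting the standard fact that complete graphs have clique-width 2.

```python
class LabeledGraph:
    """Minimal sketch of the clique-width operations."""

    def __init__(self, vertex=None, label=None):
        self.labels = {}           # vertex -> label
        self.edges = set()         # frozensets {u, v}
        if vertex is not None:     # operation i(v): one labelled vertex
            self.labels[vertex] = label

    def union(self, other):        # operation G1 + G2: disjoint union
        g = LabeledGraph()
        g.labels = {**self.labels, **other.labels}
        g.edges = self.edges | other.edges
        return g

    def add_edges(self, i, j):     # operation eta(i,j): join labels i and j
        for u, lu in self.labels.items():
            for v, lv in self.labels.items():
                if lu == i and lv == j and u != v:
                    self.edges.add(frozenset((u, v)))
        return self

    def relabel(self, i, j):       # operation rho(i -> j)
        self.labels = {v: (j if l == i else l) for v, l in self.labels.items()}
        return self

# Build K3 with only 2 labels: add each new vertex with label 2,
# join it to everything labelled 1, then merge it into label 1.
g = LabeledGraph("a", 1)
g = g.union(LabeledGraph("b", 2)).add_edges(1, 2).relabel(2, 1)
g = g.union(LabeledGraph("c", 2)).add_edges(1, 2).relabel(2, 1)
assert len(g.edges) == 3           # a triangle
```

The same two-label pattern extends to K_n for any n, which is why bounded clique-width covers dense graphs that bounded tree-width does not.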

LAAW02 
2nd March 2006 16:15 to 17:00 
Complete axiomatisation for PDLpath PDLpath is a logic introduced in [1] to express path constraints on graphs of semistructured data. Its language contains one nominal r, which is true at the root node, standard PDL modalities, converse, and a new modality #, which stands for "some edge label". We assume that the language contains countably infinitely many atomic modalities or labels. We provide a complete axiomatisation of PDLpath and a new decidability proof for it (another proof is in [1]). [1] N. Alechina, S. Demri and M. de Rijke. A modal perspective on path constraints. Journal of Logic and Computation, 6(3), 2003, pages 939-956. 

LAAW02 
2nd March 2006 17:00 to 17:45 
M Arenas 
Locally consistent transformations and query answering in data exchange Data exchange is the problem of taking data structured under a source schema and creating an instance of a target schema. Given a source instance, there may be many solutions: target instances that satisfy the constraints of the data exchange problem. Previous work has identified two classes of desirable solutions: canonical universal solutions, and their cores. Query answering in data exchange amounts to rewriting a query over the target schema to another query that, over a materialized target instance, gives the result that is semantically consistent with the source. A basic question is then whether there exists a transformation sending a source instance into a solution over which target queries can be answered. In this talk, we show that the answer is negative for many data exchange transformations that have structural properties similar to the canonical universal solution and the core. Namely, we prove that many such transformations preserve the "local" structure of the data. Using this notion, we further show that every target query rewritable over such a transformation cannot distinguish tuples whose neighborhoods in the source are similar. This gives us a first tool that helps check whether a query is rewritable. This is joint work with Pablo Barcelo, Ronald Fagin and Leonid Libkin. 

LAAW02 
3rd March 2006 09:15 to 10:15 
The complexity of processing data streams and external memory data  
LAAW02 
3rd March 2006 10:15 to 11:00 
Temporal logics for n-ary queries over XML documents  
LAAW02 
3rd March 2006 11:30 to 12:30 
Law enforcement on hypergraphs  
LAA 
6th March 2006 11:00 to 12:00 
Constraint satisfaction problems and dualities We consider constraint satisfaction (or homomorphism) problems of the form CSP(B): given a structure A, is there a homomorphism from A to a fixed structure B? The concept of duality for CSPs is based on the idea of providing an obstruction set (of structures) for homomorphism to B. There exist several types of duality according to the properties of obstruction sets. In this talk, we will review three main types of duality and discuss the problem of characterising structures with a given type of duality. 
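The CSP(B) formulation above can be made concrete with a brute-force homomorphism test (the helper names below are invented for illustration). Taking B = K2, CSP(B) is exactly 2-colorability, and the classical obstruction set, in the sense of duality, is the family of odd cycles.

```python
from itertools import product

def has_homomorphism(A, B):
    """Naive check: does a homomorphism from structure A to B exist?

    A structure is (universe, set of directed edge pairs). We try every
    map from A's universe into B's and test edge preservation.
    """
    uA, eA = A
    uB, eB = B
    for h in product(uB, repeat=len(uA)):
        m = dict(zip(uA, h))
        if all((m[x], m[y]) in eB for (x, y) in eA):
            return True
    return False

def cycle(n):
    """Undirected n-cycle, encoded with both edge directions."""
    e = {(i, (i + 1) % n) for i in range(n)}
    return (list(range(n)), e | {(b, a) for (a, b) in e})

K2 = ([0, 1], {(0, 1), (1, 0)})

assert has_homomorphism(cycle(4), K2)       # even cycles are 2-colorable
assert not has_homomorphism(cycle(3), K2)   # odd cycles are obstructions
```

The exponential enumeration here is exactly what structural results such as dualities aim to avoid: when B has finite duality, membership in CSP(B) reduces to checking finitely many obstructions.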

LAA 
7th March 2006 11:00 to 12:00 
Basic proof complexity I In four talks I plan to cover some basic, both classical as well as recent, concepts of proof complexity. An approximate outline is: (1) the Cook-Reckhow definition, lengths of proofs, link to the NP versus coNP problem. Polynomial simulation, optimal proof systems. An overview of known lower bounds. (2) Theories as "uniform" proof systems. Translations of arithmetical proofs into propositional proofs. Interpretation of lower bounds as constructions of models of bounded arithmetic theories. (3) Proof complexity of (propositional) resolution. The size-width trade-off, feasible interpolation, infinitary characterizations of hard tautologies. (4) Hard tautologies: combinatorial principles, consistency statements, proof complexity generators. 


LAA 
8th March 2006 11:00 to 12:00 
On asymptotic classes of finite structures I will discuss joint work with Dugald Macpherson, and that of some of his recent Ph.D. students, on the development of model theory for some classes of finite structures. The point of view that informs the definitions of these classes, which we call finite asymptotic classes, and that we bring to their study reflects the outlook of current model theory for infinite structures. In fact, there are specific links to infinite model theory that I will mention. Time permitting, I also will discuss some of the ongoing work of Macpherson and myself that aims to extend the framework that is the main subject of the talk. 

LAA 
9th March 2006 11:00 to 12:00 
Basic proof complexity II In four talks I plan to cover some basic, both classical as well as recent, concepts of proof complexity. An approximate outline is: (1) the Cook-Reckhow definition, lengths of proofs, link to the NP versus coNP problem. Polynomial simulation, optimal proof systems. An overview of known lower bounds. (2) Theories as "uniform" proof systems. Translations of arithmetical proofs into propositional proofs. Interpretation of lower bounds as constructions of models of bounded arithmetic theories. (3) Proof complexity of (propositional) resolution. The size-width trade-off, feasible interpolation, infinitary characterizations of hard tautologies. (4) Hard tautologies: combinatorial principles, consistency statements, proof complexity generators. 


LAA 
10th March 2006 11:00 to 12:00 
Metarouting: An algebraic approach to defining routing protocols Semirings have been used in the past thirty years as a general framework in which to describe routing problems and routing algorithms. Some of the semiring axioms can be relaxed to arrive at "routing algebras" (Sobrinho 2003), which can model the very complex policy-based routing used in the Internet today. Metarouting is centered around defining a metalanguage for routing algebras: a language where algebraic properties (such as monotonicity) needed for convergence guarantees can be automatically derived, much like types are derived in programming languages. 
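As a rough illustration of the semiring view of routing mentioned above (a generic sketch, not the metarouting language itself), a single Bellman-Ford-style relaxation can be written once over an abstract "plus" (choose the preferred route) and "times" (extend a route along an edge). Instantiating it with (min, +) gives shortest paths; other instantiations, such as (max, min) for bottleneck bandwidth, reuse the same algorithm unchanged.

```python
def route(nodes, edges, source, plus, times, zero, one):
    """Generic single-source route computation over a semiring.

    zero is the 'no route' value, one the weight of the empty route.
    Relaxing |V|-1 times suffices when the algebra is well behaved
    (e.g. the monotonicity properties metarouting aims to derive).
    """
    d = {v: zero for v in nodes}
    d[source] = one
    for _ in range(len(nodes) - 1):
        for (u, v, w) in edges:
            d[v] = plus(d[v], times(d[u], w))
    return d

nodes = ["a", "b", "c"]
edges = [("a", "b", 1), ("b", "c", 2), ("a", "c", 5)]

# (min, +) instance: classical shortest paths.
shortest = route(nodes, edges, "a", min, lambda x, y: x + y,
                 float("inf"), 0)
assert shortest["c"] == 3    # a -> b -> c beats the direct edge of weight 5
```

Policy-based algebras relax the semiring axioms, and then convergence is no longer automatic; that gap is what the metalanguage's derived properties are meant to police.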

LAA 
13th March 2006 11:00 to 12:00 
G Kun 
CSP and MMSNP are computationally equivalent We introduce the notion of expander hypergraphs as one natural generalisation of expander graphs. We give an efficient construction of expander hypergraphs. This new tool makes it possible to derandomise Feder and Vardi's reduction of CSP to large-girth CSP. This implies the derandomisation of the computational equivalence between the classes CSP and MMSNP (monotone monadic SNP). Theorem: For every language L in MMSNP there are relational structures S1, ..., Sn such that the following hold. (1) L has a polynomial reduction to the union of the languages CSP(Si). (2) For every i, the language CSP(Si) has a polynomial reduction to L. 

LAA 
14th March 2006 11:00 to 12:00 
Basic proof complexity III In four talks I plan to cover some basic, both classical as well as recent, concepts of proof complexity. An approximate outline is: (1) the Cook-Reckhow definition, lengths of proofs, link to the NP versus coNP problem. Polynomial simulation, optimal proof systems. An overview of known lower bounds. (2) Theories as "uniform" proof systems. Translations of arithmetical proofs into propositional proofs. Interpretation of lower bounds as constructions of models of bounded arithmetic theories. (3) Proof complexity of (propositional) resolution. The size-width trade-off, feasible interpolation, infinitary characterizations of hard tautologies. (4) Hard tautologies: combinatorial principles, consistency statements, proof complexity generators. 


LAA 
15th March 2006 11:00 to 12:00 
P Hell 
Graph partitions I will discuss vertex partitions V1, V2,..., Vk, where each Vi is a clique, a stable set, or an arbitrary set, and each pair Vi, Vj is completely adjacent, completely non-adjacent, or arbitrary. Many common partitions in the study of perfect graphs have such structure. The main goal is to decide when the existence of such a partition can be characterized by a finite set of forbidden induced subgraphs. These partitions can be viewed as restricted constraint satisfaction problems; connections to the dichotomy conjecture will also be discussed. 

LAA 
16th March 2006 11:00 to 12:00 
Basic proof complexity IV In four talks I plan to cover some basic, both classical as well as recent, concepts of proof complexity. An approximate outline is: (1) the Cook-Reckhow definition, lengths of proofs, link to the NP versus coNP problem. Polynomial simulation, optimal proof systems. An overview of known lower bounds. (2) Theories as "uniform" proof systems. Translations of arithmetical proofs into propositional proofs. Interpretation of lower bounds as constructions of models of bounded arithmetic theories. (3) Proof complexity of (propositional) resolution. The size-width trade-off, feasible interpolation, infinitary characterizations of hard tautologies. (4) Hard tautologies: combinatorial principles, consistency statements, proof complexity generators. 


LAA 
17th March 2006 11:00 to 12:00 
Infinite state model checking in modal logic In this talk I will discuss why model checking of basic modal logic and extensions of it on infinite state systems is interesting and important. In particular, I will present the case of model checking on `rational Kripke models' and its possible applications to regular model checking. Time permitting, I will also discuss how minimal extensions of basic modal logic can be used to specify and verify reachability problems in infinite state transition systems. 

LAA 
17th March 2006 14:15 to 15:15 
Logic and Algorithms What on earth does this obscure, old intellectual discipline have to do with the youngest intellectual discipline of computer science? In this overview talk I will explain not only how logic begat computer science, but also how logic has permeated computer science over the last few decades, so much so that it has been called "the calculus of computer science." I will particularly focus on computational-complexity theory and its possible connections with statistical mechanics. 

LAA 
27th March 2006 11:00 to 12:00 
S Szeider 
Fixed-parameter algorithms for propositional satisfiability and constraint satisfaction Parameterized complexity provides a framework for dealing with NP-hard problems. The idea is to associate with a problem instance a parameter, usually a non-negative integer k, which is small for instances of relevance, and to develop exact algorithms with running times that are possibly exponential in the parameter but uniformly polynomial in the instance size. The majority of combinatorial problems studied in the framework of parameterized complexity suggest a "natural" parameter. For example, a natural parameter for the VERTEX COVER problem is an upper bound on the size of the vertex cover one searches for. However, problems such as propositional satisfiability and constraint satisfaction do not suggest a single natural parameter; in fact, there are numerous possibilities for parameters that one can consider. The identification of parameters that are as general as possible and which are still accessible to fixed-parameter algorithms is an important research objective. In the talk we will survey recent results on fixed-parameter algorithms for propositional satisfiability and constraint satisfaction. In particular, we will consider parameters that bound the cyclicity of instances, and parameters that bound the distance of instances from a polynomial-time solvable base class. 
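The VERTEX COVER example in the abstract has a textbook fixed-parameter algorithm via bounded search trees (a standard sketch, not material from the talk): any edge must have at least one endpoint in the cover, so branch on the two endpoints and decrement the budget k. The search tree has depth at most k, giving O(2^k) branches times polynomial work per node.

```python
def vertex_cover(edges, k):
    """Return a vertex cover of size <= k, or None if none exists.

    edges: list of 2-tuples over a simple graph. Classic bounded
    search tree: pick any uncovered edge (u, v); one of u, v must be
    in the cover, so try both, each with budget k - 1.
    """
    if not edges:
        return set()                 # nothing left to cover
    if k == 0:
        return None                  # edges remain but budget exhausted
    u, v = edges[0]
    for pick in (u, v):              # branch on the two endpoints
        rest = [e for e in edges if pick not in e]
        sub = vertex_cover(rest, k - 1)
        if sub is not None:
            return sub | {pick}
    return None

# A 4-cycle has a vertex cover of size 2 but not of size 1.
square = [(0, 1), (1, 2), (2, 3), (3, 0)]
assert vertex_cover(square, 2) is not None
assert vertex_cover(square, 1) is None
```

The running time is exponential only in the parameter k, never in the number of vertices, which is exactly the "fixed-parameter tractable" shape the abstract describes.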

LAA 
28th March 2006 11:00 to 12:00 
Small extensions (Very small) extension properties of finite structures (such as naturally defined partial orders) are surprisingly tightly related to algorithmic questions and descriptive complexity. Key Words: generic structure, homomorphism, order, density, gaps and duality. 

LAA 
30th March 2006 11:00 to 12:00 
Verification and change-impact analysis of access-control policies Sensitive data are increasingly available online through the Web and other distributed protocols. System designers often create policies to capture conditions on the access to data. To reduce source clutter and improve maintenance, developers increasingly use domain-specific, declarative languages to express these policies. In turn, administrators need to analyze policies relative to properties, and to understand the effect of policy changes even in the absence of properties. This talk discusses models and techniques that support both kinds of analyses for role-based access-control policies. It discusses Margrave, a software tool that implements these analyses for standalone policies, and discusses work in progress on models that also account for the dynamic environment in which policies execute. This extension enables reasoning about interactions between policies and the programs that employ them, but using analyses that extend beyond conventional model checking. 

LAA 
3rd April 2006 11:00 to 12:00 
Ubiquitous computing: shall we understand it? Ubiquitous computing (ubicomp) is a vision of hardware/software systems that pervade our environment, doing what we want without our continual direction. For example, consider a motorway for driverless vehicles. One of the UK Grand Challenges for Computing Research addresses ubicomp not only from the visionary angle, but also from the angle of design principles and theories that will support it. Ubicomp systems will exceed those that we know by orders of magnitude. There seems little chance of extrapolating existing methods of software production to cope with them. Ubicomp offers a challenge and an opportunity to develop a new computer science that interweaves three ingredients  vision, design and theory  much more intimately than ever before. We need this, to be confident in unleashing ubicomp systems. That's the Grand Challenge; the lecture will explore how to address it. One idea is via foothill projects, which address modest aims but try to combine segments of vision, design and theory. I shall illustrate this with a mathematical model of mobility that links the physical space of sensors and buildings with the virtual space of software processes. If nothing else, this exercise shows that we have a long way to go if we want to understand mobile ubiquitous systems. 

LAA 
4th April 2006 11:00 to 12:00 
Approximation schemes for first-order definable optimisation problems  
LAA 
6th April 2006 11:00 to 12:00 
Games as controlled optimization problems  
LAA 
7th April 2006 11:00 to 12:00 
The semantics of modularity  
LAAW04 
10th April 2006 10:00 to 11:00 
P Pudlak 
On $\forall\Sigma_1^b$ sentences provable in bounded arithmetic We shall give a characterization of $\forall\Sigma_1^b$ sentences provable in the theories $S_2^i$. We shall also consider certain unrestricted versions of the principles that characterize these sentences and prove that they require iterated exponential time. This gives partial evidence that the hierarchy of $\forall\Sigma_1^b$ sentences provable in $S_2^i$ is strictly increasing. 

LAAW04 
10th April 2006 11:30 to 12:00 
The complexity of proving the discrete Jordan Curve theorem and related principles The Jordan Curve Theorem states that a simple closed curve divides the plane into exactly two connected components. We consider the "discrete" version of this theorem (JCT), where the curve lies in an n x n grid. The theorem can be formalized either as true statements in Bounded Arithmetic or as families of propositional tautologies. Here we show that the formalizations in Bounded Arithmetic are provable in some weak theories (i.e., V^0 and V^0(2)); this implies that the families of tautologies have short proofs in the corresponding propositional proof systems. First, we formalize the theorem when the simple curve is given as a sequence of edges on the grid. We show that in this case the theorem can be proved in the theory V^0. Next, we consider the case where the input is a set of edges which in general forms multiple simple closed curves. The interesting conclusion here is that the set of edges divides the plane into at least two connected components. We show that this is indeed provable in the theory V^0(2). The related principles include ST-CONN on grid graphs, considered by Buss [Polynomial-size Frege and Resolution Proofs of st-Connectivity and Hex Tautologies], and the non-planarity of K_{3,3}. ST-CONN is reducible to JCT, and thus it is provable in V^0(2). This improves Buss's result (that ST-CONN has poly-size proofs in TC^0-Frege). 

LAAW04 
10th April 2006 12:00 to 12:30 
Uniform proof complexity One main aim in Propositional Proof Complexity is to prove strong lower bounds for a general propositional proof system. Inspired by this program, we will introduce uniform reducts of propositional proof systems. Uniform reducts are sets of first order formulas of arithmetic which are defined using translations from first order formulas to propositional formulas. We will describe basic properties of uniform reducts, discuss their relationship to Cook's Programme, and explain some more advanced features of them. Recently, Steve Cook made an observation relating the study of uniform reducts to the existence of optimal proof systems; if time permits we will touch on this as well. 

LAAW04 
10th April 2006 14:00 to 15:00 
$T^1_2, T^2_2$ and search problems We give some new, natural principles characterizing the $\Sigma^b_1$ consequences of these theories, based on a reflection principle for resolution. 

LAAW04 
10th April 2006 15:30 to 16:00 
E Jerabek 
Approximate counting in bounded arithmetic We use the dual weak pigeonhole principle to develop approximate counting in bounded arithmetic. As an application, we show how to formalize classes of randomized algorithms, such as BPP. 

LAAW04 
10th April 2006 16:00 to 16:30 
When can $S^1_2$ prove the weak pigeonhole principle? It is a well-known result of Krajicek and Pudlak that if $S^1_2$ could prove the injective weak pigeonhole principle for every polynomial-time function then RSA would not be secure. In this talk, we will consider function algebras based on a common characterization of the polynomial-time functions, where we slightly modify the initial functions and further restrict the amount of allowable recursion. We will then argue that $S^1_2$ can prove the surjective weak pigeonhole principle for functions in this algebra. 

LAAW04 
11th April 2006 09:30 to 10:30 
S Cook 
Capturing complexity classes by their reasoning power For each of many standard complexity classes within NP we associate a canonical set of universal and existential theorems which are those which can be proved using concepts in the class. In some cases we can also associate canonical proof systems in the quantified propositional calculus. 

LAAW04 
11th April 2006 10:30 to 11:00 
Constraint propagation as a proof system Refutation proofs can be viewed as a special case of constraint propagation, which is a fundamental technique in solving constraint-satisfaction problems. The generalization lifts, in a uniform way, the concept of refutation from Boolean satisfiability problems to general constraint-satisfaction problems. On the one hand, this enables us to study and characterize basic concepts, such as refutation width, using tools from finite-model theory. On the other hand, this enables us to introduce new proof systems, based on representation classes, that have not been considered up to this point. We consider ordered binary decision diagrams (OBDDs) as a case study of a representation class for refutations, and compare their strength to well-known proof systems, such as resolution, the Gaussian calculus, cutting planes, and Frege systems of bounded alternation-depth. In particular, we show that refutations by OBDDs polynomially simulate resolution and can be exponentially stronger. Joint work with Albert Atserias and Phokion Kolaitis. 
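The baseline system these OBDD refutations are compared against, propositional resolution, is easy to sketch: clauses are sets of signed literals, and the resolution rule merges two clauses over a complementary pair. The naive saturation below (illustrative only, and exponential in the worst case; it resolves on the first complementary pair found) refutes a CNF exactly when it derives the empty clause.

```python
from itertools import combinations

def resolve(c1, c2):
    """Resolve two clauses on the first complementary literal pair,
    or return None if they have no such pair."""
    for lit in c1:
        if -lit in c2:
            return (c1 - {lit}) | (c2 - {-lit})
    return None

def refutable(clauses):
    """Saturate under resolution; True iff the empty clause appears."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            r = resolve(c1, c2)
            if r is not None:
                if not r:
                    return True          # derived the empty clause
                new.add(frozenset(r))
        if new <= clauses:
            return False                 # saturated without contradiction
        clauses |= new

# x, (not x or y), (not y) is unsatisfiable; (x or y) alone is not.
assert refutable([{1}, {-1, 2}, {-2}])
assert not refutable([{1, 2}])
```

A refutation's width is the size of its largest clause; the abstract's point is that such proof-theoretic measures transfer, via constraint propagation, to general constraint-satisfaction problems.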

LAAW04 
11th April 2006 11:30 to 12:00 
The proof complexity of matrix algebra There are fast parallel algorithms for computing the characteristic polynomial of a matrix: notably Berkowitz's and Csanky's algorithms. While these algorithms run in NC^2, all the known proofs of correctness require polynomial-time concepts. Are there quasi-polynomial Frege (i.e., NC^2-Frege) proofs of correctness of these algorithms? Or is it the case that the universal matrix identities separate Frege and Extended Frege? This talk will address these questions and present recent progress in this area. 

LAAW04 
11th April 2006 12:00 to 12:30 
A minimal quantified proof system for polytime reasoning  
LAAW04 
11th April 2006 14:00 to 15:00 
R Impagliazzo  Which SAT instances are the hardest?  
LAAW04 
11th April 2006 15:30 to 16:00 
S Riis 
Sporadic propositional proofs The following feature is shared by certain weak propositional proof systems: if $\psi_n$ is a uniformly generated sequence of propositional formulas, and the sequence $\psi_n$ has polynomial-size proofs, then the sequence $\psi_n$ in fact has polynomial-size proofs that are generated in a uniform manner. For stronger propositional proof systems (like the Frege proof systems) this feature might fail, and sporadic proofs (that are not instances of a sequence of uniform proofs) might exist. In the talk I will argue that understanding and limiting the behaviour of these sporadic proofs could be crucial for any serious progress in propositional proof complexity. 

LAAW04 
11th April 2006 16:00 to 16:30 
Resolution by pebbling games We present an approach to studying the complexity of refutations in Resolution by means of certain pebbling games. We will define certain classes of such games that provide a new framework for: (1) proving size lower bounds in Resolution; (2) devising new algorithms to find Resolution refutations; (3) characterizing the hardness of provability in Resolution as an "almost" satisfiability. We'll discuss how to extend a similar approach to systems stronger than Resolution. 

LAAW04 
12th April 2006 09:30 to 10:30 
T Pitassi  Using lower bounds in proof complexity  
LAAW04 
12th April 2006 10:30 to 11:00 
Narrow proofs may be spacious: separating space and width in resolution The width of a resolution proof is the maximal number of literals in any clause of the proof. The space of a proof is the maximal number of memory cells used if the proof is only allowed to resolve on clauses kept in memory. Both of these measures have previously been studied and related to the refutation size of unsatisfiable CNF formulas. Also, the resolution refutation space of a formula has been proven to be at least as large as the refutation width, but it has remained unknown whether space can be separated from width or the two measures coincide asymptotically. We prove that there is a family of k-CNF formulas for which the refutation width in resolution is constant but the refutation space is non-constant, thus solving an open problem mentioned in several previous papers. 

LAAW04 
12th April 2006 11:30 to 12:30 
Connections between zero-one programming, communication complexity and proof complexity Some of the most powerful techniques for solving NP-complete problems are based on integer programming techniques such as the Lovász-Schrijver lift-and-project method. When used for satisfiability algorithms, such techniques can be viewed as propositional proof systems that prove tautologies (by establishing that the negation of a formula is unsatisfiable), and the proof structure is a successive derivation of low-degree inequalities over the rationals until a contradiction is found. A major challenge on the frontier of propositional proof complexity is to understand the power of these propositional proof systems (which we call "threshold logics"). At present, it is conceivable that such systems could prove every tautology with a derivation cubic in the size of the tautology (note: it is fairly easy to come up with examples that require large refutations by a particular algorithm based on these ILP techniques; however, the proof systems can be viewed as generalizations that nondeterministically make optimal choices at each step to minimize derivation size, so the difference is between proving lower bounds for a particular algorithm and proving lower bounds for all possible algorithms of a certain form). It is conjectured that there are tautologies which require exponential size to prove in these systems, but there is as yet no proof to confirm this. In this lecture, we survey recent progress in a program towards proving size lower bounds for these logics by proving lower bounds for the multiparty number-on-the-forehead communication complexity of the set-disjointness function. Our primary contributions are: establishing the reduction from proof size lower bounds to communication lower bounds, and new lower bounds (that fall slightly short of what is needed for a superpolynomial proof size lower bound) for the NOF communication complexity of the set-disjointness function. 
Some of this material will appear in Computational Complexity 2005, some of it is submitted, and some of it is ongoing. Collaborators include Paul Beame, Toniann Pitassi, and Avi Wigderson. 

LAAW04 
13th April 2006 09:30 to 10:30 
The strength of multilinear proofs We discuss an algebraic proof system that manipulates multilinear arithmetic formulas. We show this proof system to be fairly strong even when restricted to multilinear arithmetic formulas of a very small depth. Specifically, algebraic proofs manipulating depth 3 multilinear arithmetic formulas are strictly stronger than Resolution, Polynomial Calculus (and Polynomial Calculus with Resolution); and (over characteristic 0) admit polynomial-size proofs of the (functional) pigeonhole principle. Finally, we illustrate a connection between lower bounds on multilinear proofs and lower bounds on multilinear arithmetic circuits. 

LAAW04 
13th April 2006 10:00 to 10:30 
A complete resolution-like calculus for MaxSAT MaxSAT is the problem of finding an assignment minimizing the number of unsatisfied clauses of a given CNF formula. We propose a resolution-like calculus for MaxSAT and prove its soundness and completeness. We also prove the completeness of some refinements of this calculus. From the completeness proof we derive an exact algorithm for MaxSAT and a time upper bound. Joint work with Jordi Levy and Felip Manya. 
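For contrast with a calculus that derives the optimum by inference, the MaxSAT objective itself is simple to state exhaustively (this brute-force sketch is for illustration only, not the talk's algorithm): enumerate all assignments and count unsatisfied clauses.

```python
from itertools import product

def min_unsat(clauses, n):
    """Minimum number of unsatisfied clauses over all assignments.

    Variables are 1..n; a literal is a signed int; a clause is
    satisfied if some literal agrees with the assignment.
    Runs in O(2^n * |clauses|) time, so only for tiny instances.
    """
    best = len(clauses)
    for bits in product([False, True], repeat=n):
        assign = {i + 1: b for i, b in enumerate(bits)}
        unsat = sum(1 for c in clauses
                    if not any(assign[abs(l)] == (l > 0) for l in c))
        best = min(best, unsat)
    return best

# x, (not x), (x or y), (not y): any assignment falsifies at least
# one clause, and x=True, y=False falsifies exactly one.
assert min_unsat([[1], [-1], [1, 2], [-2]], n=2) == 1
```

A sound and complete MaxSAT calculus must reproduce exactly this number by syntactic derivation, which is what makes completeness a stronger property here than in ordinary refutational resolution.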

LAAW04 
13th April 2006 11:30 to 12:30 
Parametrised proof complexity Motivated by the well-known concept of Fixed-Parameter Tractability, we initiate research in an area that we call Parametrised Proof Complexity. We first give a Cook-Reckhow-style definition of a parametrised proof system, and then relate the (im)possibility of the existence of a fixed-parameter tractable (FPT) proof system to the FPT vs W[2] question. We observe that the most popular method for designing fixed-parameter algorithms, Bounded Search Tree, corresponds to Tree-like Resolution in the context of parametrised proof systems, and give some upper and lower bounds for the latter. We finally discuss and prove some dichotomy results (also called complexity gap theorems) for weak relativised proof systems. This is work in progress, joint with B. Martin and S. Szeider. 

LAAW04 
13th April 2006 14:00 to 14:30 
P Naumov  Meta complexity of propositional proofs  
LAAW04 
13th April 2006 14:30 to 15:30 
On the power of the Lovász-Schrijver hierarchy Lovász and Schrijver described a generic method of tightening the LP and SDP relaxations of any $0$-$1$ optimization problem. These tightened relaxations were the basis of several celebrated approximation algorithms (such as for MAX-CUT, MAX-3SAT, and SPARSEST CUT). In this talk I will survey recent lower bounds on the number of rounds in this model for well-known problems such as MAX-SAT, Hypergraph Vertex Cover and Minimum Set Cover. I will also elaborate on why these lower bounds are encoding-dependent; in particular, the standard NP reductions do not seem to work in this model. This fact leads to a wide range of open problems which appear important for proof complexity, computational complexity and algorithm design. 

LAA 
18th April 2006 11:00 to 12:00 
T Brihaye 
O-minimal hybrid systems, bisimulation and control In this talk, in order to study bisimulations on o-minimal hybrid systems, we explain how to symbolically encode the trajectories of the systems through words. This method was introduced in our paper "On o-minimal hybrid systems" in order to give a new proof of the existence of a finite bisimulation for o-minimal hybrid systems (as previously proved in a paper by G. Lafferriere, G.J. Pappas and S. Sastry in 2000). Then we will consider the control (reachability) problem for o-minimal hybrid systems, and we will see that the previously discussed word encoding technique can help to solve this problem. 

LAA 
20th April 2006 11:00 to 12:00 
Forcing with random variables and complexity of computations and proofs We develop a new construction of models of arithmetic. The models are Boolean-valued and are based on random variables. We outline some applications to bounded arithmetic and to complexity theory. 

LAA 
21st April 2006 11:00 to 12:00 
Team logic I discuss logic based on dependence as a basic concept. What is the logic of dependence? What is the mathematics of dependence? Typical examples of dependence in the sense that I am interested in are the dependence of a field on other fields in a database, the dependence of a move in a strategy on other moves, and the dependence of a quantifier on other quantifiers in formulas of formal or natural languages. I describe a new logic, called team logic, for the study of types of dependence and give it semantics in terms of team structures. I discuss connections to other logics such as linear logic, modal logic, first order logic, independence friendly logic, Henkin quantifier logic, and second order logic. 

LAA 
24th April 2006 11:00 to 12:00 
P Pudlak 
On lower bounds in non-classical logics Whereas for classical logic this seems to be an extremely difficult problem, for some non-classical logics it is possible to prove exponential lower bounds on the lengths of proofs in systems based on axioms and derivation rules. For some modal logics, such unconditional lower bounds have recently been proved. For intuitionistic logic we have only results based on unproven assumptions.

LAA 
25th April 2006 11:00 to 12:00 
Graph searching games and graph decompositions I  
LAA 
26th April 2006 11:00 to 12:00 
Complexity gaps for resolution-based proof systems 
LAA 
27th April 2006 11:00 to 12:00 
Graph searching games and graph decompositions II  
LAA 
2nd May 2006 11:00 to 12:00 
Graph searching games and graph decompositions III  
LAA 
3rd May 2006 11:00 to 12:00 
K Kulesza 
Observations on inverting the VMPC one-way function Informally speaking, one-way functions are functions whose values are "easy" to compute from their arguments, but which are "computationally infeasible" to reverse, i.e. to find their arguments knowing their values. A rigorous definition of the terms "easy" and "computationally infeasible" is necessary but would detract from the simple idea that is being conveyed. The existence of one-way functions is only conjectured and is closely connected with Cook's hypothesis: roughly speaking, if P is not equal to NP, such functions should exist. Apart from their theoretical importance, one-way functions are fundamental for complexity-based cryptography. The problem is being attacked in many ways, and there are several instances which are perceived to be good candidates, for instance factorisation or the discrete logarithm. There are also practical reasons to search for new candidates. We investigate the possibilities of inverting the VMPC one-way function, which was proposed at Fast Software Encryption 2004 (VMPC stands for Variably Modified Permutation Composition). First, we describe the function using the language of permutation theory. Next, easily invertible instances of VMPC are derived. We also show that no VMPC function is one-to-one. Implications of these results for cryptographic applications of VMPC conclude the presentation.
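Since the abstract does not define VMPC, the following toy sketch assumes the definition from Zoltak's FSE 2004 paper, Q[x] = P[(P[P[x]] + 1) mod n] for a permutation P of {0, ..., n-1}; the sample permutation is made up.

```python
# Hypothetical sketch of the VMPC transformation, assuming the definition
# Q[x] = P[(P[P[x]] + 1) mod n] from Zoltak's FSE 2004 paper.
def vmpc(p):
    n = len(p)
    return [p[(p[p[x]] + 1) % n] for x in range(n)]

p = [3, 0, 2, 1]   # a sample permutation of {0, 1, 2, 3}
q = vmpc(p)        # -> [2, 3, 1, 0]
# Q is itself a permutation (a composition of permutations), so "inverting
# VMPC" means recovering P from Q, not inverting Q as a map on {0,...,n-1}.
assert sorted(q) == list(range(len(p)))
```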

LAA 
4th May 2006 11:00 to 12:00 
Graph searching games and graph decompositions IV  
LAA 
5th May 2006 11:00 to 12:00 
Confluent Markov chains We consider infinite-state discrete Markov chains which are confluent: each computation will almost certainly either reach a defined set F of final states, or reach a state from which F is not reachable. Confluent Markov chains include probabilistic extensions of several classical computation models such as Petri nets, Turing machines, and communicating finite-state machines. For confluent Markov chains, we consider three different variants of the reachability problem and the repeated reachability problem: the qualitative problem, i.e., deciding if the probability is one (or zero); the approximate quantitative problem, i.e., computing the probability up to arbitrary precision; and the exact quantitative problem, i.e., computing probabilities exactly. We also study the problem of computing the expected reward (or cost) of runs until reaching the final states, where rewards are assigned to individual runs by computable reward functions.
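For a *finite* chain, the qualitative variant of the reachability problem reduces to plain graph reachability on the support of the transition relation: the probability of reaching F is one iff no reachable state has F unreachable. A minimal sketch (with made-up example data; the infinite-state setting of the talk requires much more):

```python
# Qualitative reachability on a finite Markov chain: only the support of
# the transition matrix matters. P(reach F from s0) = 1 iff every state
# reachable from s0 can still reach F. (Toy sketch; example data is made up.)
def reachable(succ, start):
    seen, stack = {start}, [start]
    while stack:
        for t in succ.get(stack.pop(), []):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def reaches_almost_surely(succ, s0, F):
    return all(reachable(succ, s) & F for s in reachable(succ, s0))

# "b" is a trap from which F = {"f"} is unreachable, so the probability is < 1.
succ = {"s0": ["a", "b"], "a": ["f"], "b": ["b"], "f": []}
assert not reaches_almost_surely(succ, "s0", {"f"})
```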

LAAW05 
8th May 2006 10:00 to 11:00 
Z Manna  From verification conditions to constraints  
LAAW05 
8th May 2006 11:30 to 12:00 
Solving verification constraint problems with constraint programming  
LAAW05 
8th May 2006 14:00 to 15:00 
Program verification by parametric abstraction and semidefinite programming  
LAAW05 
8th May 2006 15:30 to 16:00 
Parameterized interfaces for open system verification of product lines  
LAAW05 
8th May 2006 16:00 to 16:30 
Polyhedral analysis of systems software  
LAAW05 
8th May 2006 16:30 to 17:00 
Proving termination of programs  
LAAW05 
9th May 2006 09:00 to 10:00 
Bounded and unbounded model checking with SAT  
LAAW05 
9th May 2006 10:00 to 10:30 
D Kroening  Model checking C++ programs that use the STL  
LAAW05 
9th May 2006 10:30 to 11:00 
K Namjoshi  Incremental model checking  
LAAW05 
9th May 2006 11:30 to 12:00 
J MarquesSilva  Towards more efficient SATbased model checking  
LAAW05 
9th May 2006 12:00 to 12:30 
Uniform + supercompilation = verification  
LAAW05 
9th May 2006 14:00 to 15:00 
Artificial Biochemistry 
LAAW05 
9th May 2006 15:30 to 16:00 
Generalizing BDD trees using minimal and/or graphs  
LAAW05 
9th May 2006 16:00 to 16:30 
Return of the JTMS: Preferences orchestrate conflict learning and solution synthesis  
LAAW05 
9th May 2006 16:30 to 17:00 
Enhancing software model checking with static program analysis  
LAAW05 
9th May 2006 17:00 to 17:30 
M Leconte  State of the art in constraint programming solvers  
LAAW05 
10th May 2006 09:00 to 10:00 
Logic verification challenges in system level design at Intel  
LAAW05 
10th May 2006 10:00 to 10:30 
Constraints in hardware verification: some industrial perspectives  
LAAW05 
10th May 2006 10:30 to 11:00 
D Lugiez  Symbolic constraint solving for cryptographic protocols  
LAAW05 
10th May 2006 11:30 to 12:00 
Automatic refinement and vacuity detection for symbolic trajectory evaluation  
LAAW05 
11th May 2006 09:00 to 10:00 
Linear constraints in probabilistic model checking  
LAAW05 
11th May 2006 10:00 to 10:30 
Time average games  
LAAW05 
11th May 2006 10:30 to 11:00 
O Maler  Controller synthesis with adversaries  
LAAW05 
11th May 2006 11:30 to 12:00 
P Abdulla  Language inclusion for timed automata 
LAAW05 
11th May 2006 14:00 to 15:00 
Consequence generation, interpolants and invariant discovery  
LAAW05 
11th May 2006 15:30 to 16:30 
Y Naveh  Constraint satisfaction for random stimuli generation  
LAAW05 
11th May 2006 16:30 to 17:00 
Verifying properties of well-founded linked lists 
LAAW05 
11th May 2006 17:00 to 17:30 
Reasoning about dynamic networks of counter systems  
LAAW05 
12th May 2006 09:00 to 10:00 
On the use of automata for representing arithmetic constraints  
LAAW05 
12th May 2006 10:00 to 10:30 
Modelling with FO(ID); Solving with SAT  
LAAW05 
12th May 2006 10:30 to 11:00 
S Cotton  Satisfiability modulo theory chains with DPLL(T)  
LAAW05 
12th May 2006 11:30 to 12:00 
A progressive approach to solving satisfiability modulo theories (SMT)  
LAAW05 
12th May 2006 14:00 to 15:00 
E Giunchiglia  State of the art in QBF reasoning, with emphasis on applications in FV  
LAAW05 
12th May 2006 15:30 to 16:30 
Compact propositional encodings of first-order theories 
LAA 
15th May 2006 11:00 to 12:00 
Connecting logic and learning Many complex domains offer only limited information about their exact state and about the way actions affect them. For example, a robot exploring a building does not know how pressing a button affects the world, and may not see the effect of pressing the button immediately. In such domains we need to learn action models in order to act effectively, at the same time as we track the (partially observed) state of the domain. In this presentation I will describe polynomial-time algorithms for learning logical models of actions' effects and preconditions in deterministic, partially observable domains. These algorithms represent the set of possible action models compactly, and update it after every action execution and partial observation. This approach yields the first tractable learning algorithm for partially observable dynamic domains. I will mention recent extensions of this work to relational domains, and will also discuss potential applications to agents playing adventure games and to formal verification.

LAA 
16th May 2006 11:00 to 12:00 
Don't care words with an application to the automata-based approach for real addition As already pointed out by Buechi, automata can be used to decide certain first-order theories of linear arithmetic, like the first-order theory over the reals with addition and the ordering. Reals are represented as omega-words (e.g., using the 2's complement representation), and the automata are recursively constructed from the formulas. The constructed automaton for a formula accepts precisely the words representing the reals that make the formula true. This automata-based approach to representing and manipulating sets of reals also has applications in infinite-state model checking. However, as with BDDs for representing finite sets, the sizes of the automata are an obstacle in the automata-based set representation. In this talk, I will show that so-called "don't care sets" for BDDs can be generalized to word languages as a means to reduce the automata sizes, and how "don't care words" can be used to build a more effective automata-based decision procedure for real addition. A general result about don't care words is that the minimal weak deterministic Buechi automaton with respect to a given don't care set with certain restrictions is uniquely determined and can be constructed efficiently.

LAA 
17th May 2006 11:00 to 12:00 
Model checking for probabilistic real-time systems Probabilistic timed automata are timed automata extended with discrete probability distributions, and can be used to model timed randomised protocols or fault-tolerant systems. This talk will give an overview of model-checking algorithms for probabilistic timed automata and their prototype implementations. Firstly, we consider a subclass of probabilistic timed automata and show that the classical approach based on an integral model of time (digital clocks) admits verification of an important class of properties including probabilistic reachability and expected reachability. Next we present a complete symbolic framework for the verification of probabilistic timed automata against the quantitative probabilistic timed temporal logic PTCTL. The algorithms operate on zones, that is, sets of valuations of the probabilistic timed automaton's clocks, and therefore avoid an explicit construction of the state space. The digital clocks approach enables modelling and quantitative analysis of protocols directly within the probabilistic model checker PRISM (www.cs.bham.ac.uk/~dxp/prism/). The symbolic approach necessitates combining data structures for real-time verification with those for symbolic probabilistic model checking. Both have been applied to analyse the performance of real-world randomised protocols such as the dynamic configuration protocol for IPv4 link-local addresses and the FireWire root contention protocol.

LAA 
18th May 2006 11:00 to 12:00 
Solving games without determinization The synthesis of reactive systems requires the solution of two-player games on graphs with omega-regular objectives. When the objective is specified by a linear temporal logic formula or a nondeterministic Buechi automaton, previous algorithms for solving the game require the construction of an equivalent deterministic automaton. However, determinization for automata on infinite words is extremely complicated, and current implementations fail to produce deterministic automata even for relatively small inputs. We show how to construct, from a given nondeterministic Buechi automaton, an equivalent nondeterministic parity automaton N that is good for solving games with objective N. The main insight is that a nondeterministic automaton is good for solving games if it fairly simulates the equivalent deterministic automaton. In this way, we omit the determinization step in game solving and reactive synthesis. The fact that our automata are nondeterministic makes them surprisingly simple, amenable to symbolic implementation, and allows an incremental search for winning strategies.

LAA 
23rd May 2006 11:00 to 12:00 
Automata-based techniques for the analysis of dynamic concurrent programs We propose formal models for dynamic programs with procedure calls and dynamic creation of concurrent processes. The considered models are based on (combinations of) various classes of rewrite systems such as multiset rewrite systems and prefix rewrite systems. We address the reachability analysis problem for such models. More precisely, we consider the problem of computing finite representations of the (potentially infinite) sets of reachable configurations in such models, using finite word/tree automata to represent these reachability sets. The proposed constructions can be used for solving model-checking problems for the considered classes of infinite-state models.

LAA 
25th May 2006 11:00 to 12:00 
A physical analysis of mechanical computability Turing's model of autonomous computation is based on the idealized psychological characteristics of a human calculator: its ability to change its "state of mind" in a stepwise fashion based on the recognition of symbolic configurations. This leads to a mathematical model of effective computability, with its well-known capabilities and limitations. We extend this analysis to the idealized physiological characteristics of a machine computer: its ability to manipulate matter using energy. This leads to a new notion of feasibility based on physical resource bounds, in which mass and energy constraints further restrict the usual notions of space and time complexity.

LAA 
30th May 2006 11:00 to 12:00 
Post's lattice with applications to complexity theory (Part I) In this sequence of lectures we will give a basic introduction to Post's lattice of all sets of Boolean functions closed under superposition (a.k.a. clones), and show how to make use of this structure to obtain complexity classifications of diverse problems for Boolean circuits, propositional formulas, databases, and Boolean constraint satisfaction problems.
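As a small taste of how this structure is used, the five maximal clones of Post's lattice (0-preserving, 1-preserving, monotone, self-dual, affine) already decide completeness: by Post's criterion, a set of Boolean functions generates all Boolean functions iff no single one of the five classes contains them all. A sketch, with functions given as truth tables indexed by bit tuples:

```python
from itertools import product

# Membership of an n-ary Boolean function (truth table indexed by bit
# tuples) in the five maximal clones of Post's lattice.
def post_classes(f, n):
    pts = list(product([0, 1], repeat=n))
    classes = set()
    if f[(0,) * n] == 0:
        classes.add("T0")  # 0-preserving
    if f[(1,) * n] == 1:
        classes.add("T1")  # 1-preserving
    if all(f[x] <= f[y] for x in pts for y in pts
           if all(a <= b for a, b in zip(x, y))):
        classes.add("M")   # monotone
    if all(f[x] == 1 - f[tuple(1 - a for a in x)] for x in pts):
        classes.add("D")   # self-dual
    c0 = f[(0,) * n]       # affine iff f(x) = c0 xor c1*x1 xor ... xor cn*xn
    c = [f[tuple(int(j == i) for j in range(n))] ^ c0 for i in range(n)]
    if all(f[x] == c0 ^ (sum(ci * xi for ci, xi in zip(c, x)) % 2) for x in pts):
        classes.add("L")   # affine over GF(2)
    return classes

NAND = {x: 1 - (x[0] & x[1]) for x in product([0, 1], repeat=2)}
assert post_classes(NAND, 2) == set()   # NAND lies in no maximal clone: complete
```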

LAA 
1st June 2006 11:00 to 12:00 
Post's lattice with applications to complexity theory (Part II) In this sequence of lectures we will give a basic introduction to Post's lattice of all sets of Boolean functions closed under superposition (a.k.a. clones), and show how to make use of this structure to obtain complexity classifications of diverse problems for Boolean circuits, propositional formulas, databases, and Boolean constraint satisfaction problems.

LAA 
5th June 2006 11:00 to 12:00 
Post's lattice with applications to complexity theory (Part III) In this sequence of lectures we will give a basic introduction to Post's lattice of all sets of Boolean functions closed under superposition (a.k.a. clones), and show how to make use of this structure to obtain complexity classifications of diverse problems for Boolean circuits, propositional formulas, databases, and Boolean constraint satisfaction problems.

LAA 
6th June 2006 11:00 to 12:00 
And logic begat computer science: When giants roamed the Earth During the past fifty years there has been extensive, continuous, and growing interaction between logic and computer science. In fact, logic has been called "the calculus of computer science". The argument is that logic plays a fundamental role in computer science, similar to that played by calculus in the physical sciences and traditional engineering disciplines. Indeed, logic plays an important role in areas of computer science as disparate as architecture (logic gates), software engineering (specification and verification), programming languages (semantics, logic programming), databases (relational algebra and SQL), artificial intelligence (automated theorem proving), algorithms (complexity and expressiveness), and theory of computation (general notions of computability). This non-technical talk will provide an overview of the unusual effectiveness of logic in computer science by surveying the history of logic in computer science, going back all the way to Aristotle and Euclid, and showing how logic actually gave rise to computer science.

LAA 
8th June 2006 11:00 to 12:00 
Choiceless polynomial time We will introduce the choiceless polynomial time (CPT) model of computation, due to Blass, Gurevich and Shelah. With the addition of a counting operator, CPT is known to be more powerful than inflationary fixed-point logic with counting (IFP+C) while still being contained in polynomial time. Blass et al. give a number of candidate problems for separating CPT with counting from P, including the query due to Cai, Fuerer and Immerman that separates IFP+C from P. We show that this particular query is definable in CPT, even without counting.

LAA 
12th June 2006 11:00 to 12:00 
Model theory on well-behaved finite structures The early days of finite model theory saw a variety of results establishing that the model theory of the class of finite structures is not well-behaved. Recent work has shown that considering subclasses of the class of finite structures allows us to recover some good model-theoretic behaviour. This appears to be especially true of some classes that are known to be algorithmically well-behaved. I will review some results in this area and explore the connection between logic and algorithms.

LAA 
13th June 2006 11:00 to 12:00 
Analysis of recursive Markov chains, recursive Markov decision processes and recursive stochastic games (Part I)  
LAA 
14th June 2006 11:00 to 12:00 
Environment abstraction for parameterised systems In this talk, we describe environment abstraction, a new form of abstraction for parameterized systems. Characteristically, environment abstraction views a concurrent system from the point of view of a single process. We argue that for systems designed by human programmers this Ptolemaic view of the world yields precise and feasible abstract models. We survey examples of distributed algorithms and cache coherence protocols which were successfully verified by environment abstraction. Joint work with Ed Clarke and Muralidhar Talupur.

LAA 
15th June 2006 11:00 to 12:00 
Analysis of recursive Markov chains, recursive Markov decision processes and recursive stochastic games (Part II)  
LAA 
20th June 2006 11:00 to 12:00 
Analysis of recursive Markov chains, recursive Markov decision processes and recursive stochastic games (Part III) 
LAA 
22nd June 2006 11:00 to 12:00 
On the complexity of infinite computations Finite-state automata constitute the very basic level in classical complexity theory. In contrast, when applied to infinite words and trees, they reveal a remarkable definability power, going beyond the Borel hierarchy. Still, some good properties of finite automata remain, like decidability of normal forms. The talk will compare various complexity hierarchies with the emphasis on decidability questions, and present some recent results obtained by the author in collaboration with Igor Walukiewicz and Filip Murlak. For deterministic automata on infinite trees, the situation is by now clarified: both the automata-index hierarchy and the Wadge hierarchy are decidable. In contrast, the nondeterministic case calls for new techniques and ideas. We will present a modest step in this direction, showing that the so-called game languages (essentially nondeterministic) form a hierarchy w.r.t. Wadge reducibility.

LAA 
23rd June 2006 11:00 to 12:00 
A deterministic subexponential algorithm for solving parity games The existence of polynomial time algorithms for the solution of parity games is a major open problem. The fastest known algorithms for the problem are randomized algorithms that run in subexponential time. Randomness seems to play an essential role in these algorithms. We use a completely different, and elementary, approach to obtain a deterministic subexponential algorithm for the solution of parity games. Our deterministic algorithm is almost as fast as the randomized algorithms mentioned above. This is joint work with Mike Paterson and Uri Zwick. 

LAA 
26th June 2006 11:00 to 12:00 
Modelling in logic; solving with SAT The seminal theorem of Fagin states that existential second-order logic (ESO) captures NP. This and other results in descriptive complexity suggest that logics could be viewed as programming languages for their corresponding complexity classes. Despite the conceptual attractiveness, general application of this idea remains far from practical. We describe a project which can be seen as using the idea to produce practical tools for modelling and solving search problems. A problem instance is a finite structure, and a problem specification is a formula defining the relationship between an instance and its solutions. Solving a problem amounts to expanding the structure with new relations (witnessing ESO quantifiers) to satisfy the formula. Finding these is reduced to SAT, or a suitable extension, to take advantage of recent remarkable progress in solver technology. We discuss progress and problems to date.
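The pipeline sketched in the abstract can be caricatured in a few lines: the instance is a finite structure (a graph), the specification asks for an expansion by a new relation (a 3-colouring) satisfying a first-order condition, and the search for the expansion is delegated to a propositional search. Here naive enumeration stands in for a real SAT solver, and the example graphs are made up:

```python
from itertools import product

# Toy version of the modelling pipeline: instance = finite structure (graph),
# specification = "there is a 3-colouring with no monochromatic edge",
# solving = searching for the expansion. Enumeration stands in for SAT.
def three_colour(vertices, edges):
    for assignment in product(range(3), repeat=len(vertices)):
        col = dict(zip(vertices, assignment))
        if all(col[u] != col[v] for u, v in edges):   # the first-order condition
            return col                                 # a witnessing expansion
    return None                                        # specification unsatisfiable

assert three_colour(["a", "b", "c"], [("a", "b"), ("b", "c"), ("a", "c")])
k4 = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"), ("b", "d"), ("c", "d")]
assert three_colour(["a", "b", "c", "d"], k4) is None  # K4 is not 3-colourable
```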

LAA 
27th June 2006 11:00 to 12:00 
Patterns of compositional reasoning Compositional reasoning, which reduces reasoning about a concurrent system to reasoning about its individual components, is an essential tool for managing state explosion in model checking. Typically, such reasoning is carried out in an assume-guarantee manner: each component guarantees its behavior based on assumptions about the behavior of other components. Restrictions imposed on such methods to avoid unsoundness may also result in incompleteness; that is, one is unable to prove certain properties. We construct a general framework for reasoning about process composition, formulate an assume-guarantee method, and show that it is sound and semantically complete. We then show how to instantiate the framework for common notions of process behavior and composition. For these notions, the instantiations result in complete, mutually inductive, assume-guarantee reasoning methods. Joint work with Nina Amla, E. Allen Emerson, and Kedar Namjoshi.

LAA 
28th June 2006 11:00 to 12:00 
A fresh look at testing for asynchronous communication Testing is one of the fundamental techniques for verifying whether a computing system conforms to its specification. Developing an efficient and exhaustive test suite is a challenging problem, especially in the setting of distributed systems. We take a fresh look at the theory of testing for message-passing systems, based on a natural notion of observability in terms of input-output relations. We propose two notions of test equivalence: one which corresponds to presenting all test inputs up front, and the other which corresponds to interactively feeding inputs to the system under test. We compare our notions with those studied earlier, notably the equivalence proposed by Tretmans. In Tretmans' framework, asynchrony is modelled using synchronous communication by augmenting the state space of the system with queues. We show that the first equivalence we consider is strictly weaker than Tretmans' equivalence and undecidable, whereas the second notion is incomparable. We also establish (un)decidability results for these equivalences. This is joint work with Puneet Bhateja and Madhavan Mukund from the Chennai Mathematical Institute.

LAA 
29th June 2006 11:00 to 12:00 
Models of biological regulatory networks The genome is a self-regulating network that controls the biochemical reactions in a cell. Biologists have known this for almost half a century, but only recently has systems-level genomics become a major undertaking in their discipline. Research in this area takes many forms, ranging from characterizations of general classes of networks to methodologies for accurately describing and modeling specific examples of genomic nets. This lecture will outline and give examples of some of these efforts.

LAAW06 
3rd July 2006 09:00 to 10:25 
Dynamic-epistemic logic of games Games involve actions, knowledge, beliefs, strategies, preferences, and all of these can change interactively as moves are being played. We will show how such phenomena can be modeled in current dynamic logics of information update, belief change, and preference upgrade. This perspective sits at the interface of philosophical logic, computational logic, and game theory. We give some illustrations concerning communication and solution procedures for games, and we mention a number of open problems in the area, both conceptual and technical.


LAAW06 
3rd July 2006 10:30 to 10:45 
Game quantification on automatic structures and hierarchical games
http://www.games.rwth-aachen.de/History/PastEvents/cambridge06.html


LAAW06 
3rd July 2006 11:30 to 12:00 
Towards the Wadge hierarchy of weak alternating tree automata


LAAW06 
3rd July 2006 12:00 to 12:30 
Are parity games spineless?


LAAW06 
3rd July 2006 14:00 to 14:30 
T Colcombet 
Ramseyan factorisation for trees


LAAW06 
3rd July 2006 14:30 to 14:45 
M Samuelides 
Pebble tree-walking automata


LAAW06 
3rd July 2006 14:45 to 15:00 
A hierarchy of automatically presentable omega-words having a decidable MSO theory


LAAW06 
3rd July 2006 15:30 to 16:00 
On distributed synthesis of discrete systems


LAAW06 
3rd July 2006 16:00 to 16:30 
Solving fixpoint equations in omega-continuous semirings: Some ideas and many questions


LAAW06 
3rd July 2006 16:30 to 17:00 
On size versus number of variables


LAAW06 
4th July 2006 09:00 to 10:25 
Mechanism design
The field of economic mechanism design has been an active area of research in economics for many years. Recently, many researchers in computer science and artificial intelligence have contributed to this area. The field uses tools of game theory to design rules of interaction for economic transactions that yield some desired outcome, such as maximizing social welfare or maximizing revenue. In this short tutorial I provide an overview of the subject for an audience with limited knowledge of game theory.
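A standard textbook example of such a designed rule (not specific to this tutorial) is the second-price sealed-bid (Vickrey) auction: the highest bidder wins but pays the second-highest bid, which makes truthful bidding a dominant strategy while allocating the item to the bidder who values it most. The bids below are made up:

```python
# Second-price (Vickrey) auction: a classic mechanism under which bidding
# one's true value is a dominant strategy. (Illustrative example data.)
def vickrey(bids):
    """bids: dict bidder -> bid. Returns (winner, price paid)."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    return ranked[0], bids[ranked[1]]

winner, price = vickrey({"alice": 10, "bob": 7, "carol": 4})
assert (winner, price) == ("alice", 7)   # alice wins but pays bob's bid
```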


LAAW06 
4th July 2006 10:30 to 11:00 
Stable partitions in coalition games


LAAW06 
4th July 2006 11:30 to 12:00 
From discounting to parity games


LAAW06 
4th July 2006 12:00 to 12:30 
H Gimbert 
Positional stochastic games


LAAW06 
4th July 2006 14:00 to 14:30 
Recursive concurrent stochastic games


LAAW06 
4th July 2006 14:30 to 15:00 
Backwards induction for games of infinite horizon


LAAW06 
4th July 2006 15:30 to 16:55 
Hypertree decompositions One of the best-known methods for decomposing graphs is the method of tree decompositions introduced by Robertson and Seymour. Many NP-hard problems become polynomially solvable if restricted to instances whose underlying graph structure has bounded treewidth. The notion of treewidth can be straightforwardly extended to hypergraphs by simply considering the treewidth of their primal graphs or, alternatively, of their incidence graphs. However, doing so comes with a loss of information on the structure of a hypergraph, with the effect that many polynomially solvable problems cannot be recognized as such because the treewidth of the underlying hypergraphs is unbounded. In particular, the treewidth of the class of acyclic hypergraphs is unbounded. In this talk, I will describe more appropriate measures for hypergraph acyclicity, in particular the method of hypertree decompositions and the associated concept of hypertree width. After giving general results on hypertree decompositions, I will report on game-theoretic characterizations of hypergraph decomposition methods, and give a survey of recent results.


LAAW06 
5th July 2006 09:00 to 10:25 
Games as an algorithmic construct Games were introduced into complexity theory by Chandra, Kozen, and Stockmeyer (JACM, 1981) through the concept of alternation, which generalizes nondeterminism. The standard use of alternation is in complexity-theoretic settings. The focus of this talk is on presenting games and alternation as a powerful algorithmic construct. The driving examples are various automated-verification tasks, where games yield elegant and optimal algorithms.


LAAW06 
5th July 2006 10:30 to 11:00 
Y Venema 
Coalgebra automata


LAAW06 
5th July 2006 11:30 to 12:00 
S Froschle 
When is secrecy decidable?


LAAW06 
5th July 2006 12:00 to 12:15 
Finitary parity and Streett games


LAAW06 
5th July 2006 12:15 to 12:30 
S Lasota 
Faster algorithm for bisimulation equivalence of normed context-free processes


LAAW06 
6th July 2006 09:00 to 10:25 
Nested words and trees This talk is centered around our recent model of nested words as a representation of data with both a linear ordering and a hierarchically nested matching of items. Such dual structure occurs in diverse corners of computer science, ranging from executions of structured programs, where there is a natural well-nested matching of entries to and exits from functions and procedures, to XML documents, with the well-nested structure given by start-tags matched with end-tags. Finite-state acceptors for nested words define the class of regular languages of nested words, which has all the appealing theoretical properties that the class of classical regular word languages enjoys: deterministic nested word automata are as expressive as their nondeterministic counterparts; the class is closed under union, intersection, complementation, concatenation, and Kleene*; membership, emptiness, language inclusion, and language equivalence are all decidable; and definability in monadic second-order logic corresponds exactly to finite-state recognizability. We argue that for algorithmic linear-time verification of pushdown systems, instead of viewing the program as a context-free language over words, one should view it as a regular language of nested words; this allows model checking of many properties (such as stack inspection and pre/post conditions) that are not expressible in existing specification logics. We also discuss nested trees, obtained by adding nesting edges along paths of trees, and finite-state acceptors over nested trees. This leads to a very expressive fixpoint logic (equivalently, alternating parity nested tree automata) with a decidable model-checking problem for pushdown systems.
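The matching relation the abstract refers to can be illustrated in a few lines: positions are calls, returns, or internals, and a stack recovers the hierarchical matching, just as for procedure entries/exits or XML start-/end-tags. (A toy sketch with made-up data, not the automaton model itself; here, as in XML, a return's tag must match its call's.)

```python
# Recover the nesting relation of a nested word: calls match returns in a
# well-nested fashion, internal positions are ignored. Mirrors the stack
# discipline of procedure calls or XML tags.
def matching(word):
    """word: list of (kind, tag) with kind in {'call', 'ret', 'int'}.
    Returns the list of (call index, return index) pairs, or None if the
    word is not well-nested."""
    stack, pairs = [], []
    for i, (kind, tag) in enumerate(word):
        if kind == "call":
            stack.append((i, tag))
        elif kind == "ret":
            if not stack or stack[-1][1] != tag:
                return None                      # unmatched or crossing return
            pairs.append((stack.pop()[0], i))
    return pairs if not stack else None          # pending calls: not well-nested

xml = [("call", "a"), ("call", "b"), ("int", "x"), ("ret", "b"), ("ret", "a")]
assert matching(xml) == [(1, 3), (0, 4)]
assert matching([("call", "a")]) is None
```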

LAAW06 
6th July 2006 10:30 to 11:00 
Game semantics and automata


LAAW06 
6th July 2006 11:30 to 12:00 
Verifying concurrent message-passing C programs with recursive calls


LAAW06 
6th July 2006 12:00 to 12:15 
K Rozier 
Algorithms for automatatheoretic linear temporal logic model checking


LAAW06 
6th July 2006 12:15 to 12:30 
Experimental evaluation of complementation of nondeterministic Buechi automata


LAAW06 
6th July 2006 14:00 to 14:30 
Optimality equations and strategy improvement for average payoff games


LAAW06 
6th July 2006 14:30 to 14:45 
A Trivedi 
A strategy improvement algorithm for optimal time reachability games


LAAW06 
6th July 2006 14:45 to 15:00 
Communicating timed automata


LAAW06 
6th July 2006 15:30 to 16:00 
Proving liveness by backwards reachability


LAAW06 
6th July 2006 16:00 to 16:30 
Model checking games for fixpoint logic with Chop


LAAW06 
7th July 2006 09:00 to 10:25 
Deterministic grammars We propose a survey of various deterministic grammars (alphabetic systems) used to define infinite words, terms, trees and graphs. The deterministic grammars over words are the morphisms, which are used to generate the morphic words. The labelled deterministic grammars over words coincide with the simple (context-free) grammars, or the monadic recursive schemes, and can be used to produce simple infinite trees and graphs. The deterministic grammars over terms are the recursive program schemes, and define the algebraic terms. The deterministic grammars over typed terms are used to generate a hierarchy of infinite terms, and can be used to generate a hierarchy of higher-order morphic words. The deterministic grammars over (finite) graphs generate the transition graphs of pushdown automata. We will present these grammars with known general results, and then give new results with open questions and possible extensions.

LAAW06 
7th July 2006 10:30 to 11:00 
R Lazic 
On LTL with the freeze quantifier and register automata


LAAW06 
7th July 2006 11:30 to 12:00 
Are oneplayer games always simple?


LAAW06 
7th July 2006 12:00 to 12:30 
Tree algebras
