CRECOGI+ELICA+GDRILL Plenary Meeting: October 8-11, 2018
A joint plenary meeting of the three research projects CRECOGI, ELICA, and GDRILL will take place in Paris (France) on October 8-11, 2018. There is quite a large overlap between the projects, and this will be a nice occasion to share our results and discuss ongoing work.
Program
Program
Monday, October 8th, room 3052
- 9.50: Welcome
- 10.00-10.45: First Session
- Shin-ya Katsumata.
Fibrational Bisimulations and Quantitative Reasoning.
Bisimulation and bisimilarity are fundamental notions in comparing state-based systems. Their extensions to a variety of systems have been actively pursued in recent years, a notable direction being quantitative extensions. In this paper we present an abstract categorical framework for such extended (bi)simulation notions. We use coalgebras as system models and fibrations for organizing predicates following the seminal work by Hermida and Jacobs, but our focus is on the structural aspect of fibrational frameworks. Specifically we use morphisms of fibrations as well as canonical liftings of functors via Kan extensions. We apply this categorical framework by deriving some known properties of the Hausdorff pseudometric and approximate bisimulation in control theory.
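As a concrete point of reference, the classical boolean bisimilarity that such quantitative frameworks generalise can be computed on a finite labelled transition system by naive partition refinement. The sketch below (all names ours) is a minimal illustration, not the paper's categorical construction:

```python
# Classical (boolean) bisimilarity on a finite labelled transition
# system, computed by naive partition refinement -- the base notion
# that quantitative/fibrational (bi)simulations generalise.
def bisimilarity(states, labels, step):
    part = {s: 0 for s in states}                 # start from one block
    while True:
        # signature of a state: its block, plus the set of successor
        # blocks reachable under each label
        sig = {s: (part[s], tuple(frozenset(part[t] for t in step(s, a))
                                  for a in labels))
               for s in states}
        ids = {v: i for i, v in enumerate(sorted(set(sig.values()), key=repr))}
        new = {s: ids[sig[s]] for s in states}
        if len(set(new.values())) == len(set(part.values())):
            return new                            # partition is stable
        part = new

# p loops on 'a'; q and r alternate on 'a'; s is deadlocked.
trans = {('p', 'a'): {'p'}, ('q', 'a'): {'r'}, ('r', 'a'): {'q'}}
part = bisimilarity(['p', 'q', 'r', 's'], ['a'],
                    lambda s, a: trans.get((s, a), set()))
# p, q, r end up in the same block (each can always fire an 'a');
# the deadlocked s lands in a different one.
```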
- 10.45-11.15: Coffee Break
- 11.15-13.00: Second Session
- 11.15-12.00: Koko Muroya.
Token-passing semantics with and without rewriting.
When applied to program semantics, GoI can provide execution models of functional programs, so-called "token-passing" semantics. A program is represented as a diagram, such as a proof net or a lambda-graph, and its evaluation result is computed by passing an agent, the token, around the diagram. This semantics is without rewriting: the diagram stays fixed during evaluation. Reduction semantics of conventional functional programs can be revisited as token-passing semantics with rewriting: instead of rewriting a term following an evaluation strategy, we can rewrite its diagrammatic representation, in such a way that token passing takes control of rewriting. This talk gives an overview of our recent work using token-passing semantics with and without rewriting: the disciplined combination of token passing and diagram rewriting, where selective rewriting is vital. The semantics can naturally model unconventional but increasingly significant paradigms of programming, namely programming with "computation graphs". Computation graphs are used to represent a machine-learning model, dependency between modifiable data, etc. A program not only calculates a value, but also constructs a computation graph, manipulates it, and observes it to yield a value. The combination of token passing and selective diagram rewriting is capable of modelling evaluation of the program, from the coexistence of values and computation graphs to some manipulation of computation graphs.
[slides]
- 12.15-13.00: Léo Stefanesco.
An Asynchronous Soundness Theorem for Concurrent Separation Logic.
Concurrent separation logic (CSL) is a specification logic for concurrent imperative programs with shared memory and locks. In this paper, we develop a concurrent and interactive account of the logic inspired by asynchronous game semantics. To every program C, we associate a pair of asynchronous transition systems ⟦C⟧S and ⟦C⟧L which describe the operational behavior of the Code when confronted with its Environment or Frame – both at the level of machine states (S) and of machine instructions and locks (L). We then establish that every derivation tree π of a judgment Γ⊢{P}C{Q} defines a winning and asynchronous strategy ⟦π⟧Sep with respect to both asynchronous semantics ⟦C⟧S and ⟦C⟧L. From this, we deduce an asynchronous soundness theorem for CSL, which states that the canonical map ℒ:⟦C⟧S→⟦C⟧L, from the stateful semantics ⟦C⟧S to the stateless semantics ⟦C⟧L, satisfies a basic fibrational property. We advocate that this provides a clean and conceptual explanation for the usual soundness theorem of CSL, including the absence of data races.
[slides]
- 13.00-15.00: Lunch (3rd Floor) and Free Discussion
- 15.00-16.45: Third Session
- 15.00-15.45: Kanako Kobayashi.
Co-lifting of Heyting algebras and its S4 analogue.
As is well known, intuitionistic logic can be embedded into the modal logic S4 via the Girard translation. This can be extended to a correspondence between intermediate logics and normal extensions of S4. From the algebraic point of view, it can be considered as a (co-)Kleisli construction of algebras, which transforms a given S4 algebra into a Heyting algebra. We are interested in this correspondence between S4 algebras and Heyting algebras. In this talk, we will focus on two constructions of Heyting algebras: co-lifting and quotient modulo the monolith. They were used by Wronski (1973) to construct uncountably many intermediate logics with the disjunction property. We will give a categorical refinement of these two constructions, and propose their S4 analogue. This S4 counterpart and the original construction of Heyting algebras commute with the Kleisli construction. As an application, we obtain new extensions of S4 with the disjunction property.
[slides]
- 16.00-16.45: Gabriele Vanoni.
On Randomised Strategies in the lambda-calculus.
In this work we study randomised reduction strategies - a notion already known in the context of abstract reduction systems - for the lambda-calculus. We develop a simple framework that allows us to prove a randomised strategy to be positive almost-surely normalising. Then we propose a simple example of a randomised strategy for the lambda-calculus that has such a property, and we show why it is non-trivial with respect to classical deterministic strategies such as leftmost-outermost or rightmost-innermost. We conclude by studying this strategy for two classical sub-lambda-calculi, namely those where duplication and erasure are syntactically forbidden.
[slides]
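A minimal illustration of the key property (positive almost-sure normalisation) on a toy abstract reduction system, not the paper's strategy for the lambda-calculus; all names are ours. From any state n > 0 the strategy reaches the normal form 0 with probability 1/2 and otherwise moves to n + 1:

```python
import random

# Toy abstract reduction system, not the paper's strategy: from any
# state n > 0, the randomised strategy reduces to the normal form 0
# with probability 1/2, and otherwise steps to n + 1.
def run(n, rng):
    steps = 0
    while n > 0:
        steps += 1
        n = 0 if rng.random() < 0.5 else n + 1
    return steps

# Every run terminates with probability 1, and the expected number of
# steps solves E = 1 + E/2, i.e. E = 2 -- so the strategy is *positive*
# almost-surely normalising (finite expected reduction length).
rng = random.Random(42)
avg = sum(run(1, rng) for _ in range(100_000)) / 100_000
# avg sits close to the analytic value 2
```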
- 16.45-18.00: Free Discussion
Tuesday, October 9th, room 3052
- 9.30-10.15: First Session
- Paolo Pistone.
On proof equivalence in Second Order Multiplicative Linear Logic.
Proof nets provide permutation-independent representations of proofs and are a natural tool to investigate proof equivalence in fragments of Linear Logic. However, for second-order fragments proof nets do not capture all equivalences which hold in parametric models and which play a fundamental role in the second-order representation of categorical data types. We consider the problem of characterizing such equivalences from both a syntactic and a semantic viewpoint. First, we present a compact representation of MLL2 proof nets (called "existential linkings") inspired by the "rewiring approach" used for the characterization of the free *-autonomous category. Existential linkings capture, for a fragment of MLL2 related to the Yoneda isomorphism, the equivalence over proofs generated by the so-called dinatural interpretation, a well-known approach to parametric models. Then we present a second-order extension of the relational model of MLL which seems to characterize observational equivalence of MLL2 proof nets. This model arises from the analysis of cut-elimination in MLL2 and can be related to dinaturality and parametricity.
[slides]
- 10.15-10.45: Coffee Break
- 10.45-12.30: Second Session
- 10.45-11.30: Paul-André Melliès.
Template games: a 2-categorical model of differential linear logic.
Game semantics is the art of interpreting formulas (types) as games and proofs (programs) as strategies interacting in space and time with their environment. In order to reflect the interactive behaviour of programs, strategies are required to follow specific scheduling policies. Typically, in the case of a purely sequential programming language, the program (Player) and its environment (Opponent) play one after the other, in a strictly alternating way. In the case of a concurrent language, on the other hand, Player and Opponent are allowed to play several moves in a row, in a non-alternating way. In both cases, the scheduling policy is designed very carefully in order to ensure that the strategies synchronise properly and compose well when plugged together. A longstanding conceptual problem has been to understand when and why a given scheduling policy works and is compositional in that sense. In this talk, I will introduce the notion of template game and exhibit a number of simple and fundamental combinatorial properties which ensure that a given scheduling policy defines a monoidal closed bicategory of games, strategies and simulations. The general framework of template games will be illustrated by constructing two game models of (differential) linear logic with different flavours (alternating and non-alternating) using the same categorical combinatorics, performed in the category of small categories.
- 11.45-12.30: Anupam Das.
Non-wellfounded proof theory for Kleene algebra and extensions.
In this talk I will review a recent line of work on the proof theory of Kleene algebra (KA) and its extension by meets and residuals (AL). In this line of work there have been three notable accomplishments:
(1) A regular cut-free system complete for KA that exhibits optimal proof-search complexity (PSPACE) (Das & Pous '17).
(2) A cut-elimination procedure for a non-wellfounded sequent calculus for AL, with regularity/irregularity characterising all models/*-continuous models respectively (Das & Pous '18).
(3) Alternative proofs of (left-handed) completeness for KA, via (1) (Das, Doumane & Pous '18).
In particular, I will focus on (2), showing how natural ideas arising from computational interpretations of proofs may be used to recover invariants of the cut-elimination process to prove convergence.
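As a down-to-earth companion to the proof-theoretic results, a KA equation can be tested against its language semantics by brute force over bounded words; agreement on bounded words is only evidence, not a proof, while a mismatch refutes the equation. A sketch (names and the length bound are ours):

```python
from itertools import product

# Language semantics of Kleene algebra terms over {a, b}, truncated at
# words of length <= N.
N = 6

def words_upto(n, sigma="ab"):
    return {"".join(w) for k in range(n + 1) for w in product(sigma, repeat=k)}

def cat(L1, L2):
    # concatenation, discarding words longer than the bound N
    return {u + v for u in L1 for v in L2 if len(u) + len(v) <= N}

def star(L):
    # Kleene star by saturation; terminates because lengths are bounded
    out, frontier = {""}, {""}
    while frontier:
        frontier = cat(frontier, L) - out
        out |= frontier
    return out

# The KA theorem (a + b)* = (a* b*)*, checked on all words of length <= N
lhs = star({"a", "b"})
rhs = star(cat(star({"a"}), star({"b"})))
# here both sides denote the full language of words of length <= N
```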
- 12.30-15.00: Lunch (3rd Floor) and Free Discussion
- 15.00-16.45: Third Session
- 15.00-15.45: Jean-Baptiste Joinet.
Definition by abstraction revisited: Collusions, Orthogonality and Types.
In the last decades of the 19th century, the members of the Italian school of logic (Peano, Burali-Forti, Padoa, Vailati …) spent a lot of effort on understanding the epistemological status of "definitions by abstraction" (a kind of definition increasingly used in the mathematical practice of the 19th century): they characterized reflexive, symmetric and transitive relations R (equivalence relations) as those that allow one to introduce a fresh operator f such that f(x)=f(y) iff xRy.
Frege and Russell then saw definitions by abstraction as tools able to create new higher-order entities: the partition they induce on the set on which they are defined (if any: we are before Russell's paradox and ZF) was indeed seen as inducing, at the next order, separate new individualities (disjoint non-empty subsets covering the original set).
This way of relating abstract ontology and equivalence relations has, so to speak, two Parmenidean characteristics: individualities are considered independently of time (time-independence: first point); the emergence of abstract individuals is produced from some kind of "sameness criterion" (sameness-based abstraction: second point).
Denotational semantics of programs, the scientific programme launched by Christopher Strachey and Dana Scott around 1969, can be seen as the first step made to face the first point (hence promoting "individualities in time"): in a computational universe, transformation comes first, and separation phenomena are generated by (and thus must be compatible with) these transformations.
That research programme propagated early to proof theory (under the name "denotational semantics of proofs") through the "proofs-as-programs" paradigm (generalizing the Curry-Howard isomorphism between normalization in natural deduction for minimal logic and program execution in the simply typed lambda-calculus), completed by the "formulas-as-types" paradigm (formulas as sets of programs having a common behaviour, characterized in terms of closure under bi-orthogonality, for a dynamics-dependent orthogonality relation).
In my talk, I will present a complementary but related research programme promoting an "agonal viewpoint" on abstraction (replacing the sameness-based viewpoint). This programme starts by introducing a generalization of equivalence relations: collusions (total and surjective collusive relations).
After having (1) presented how one recovers equivalence relations as a particular case of collusions, (2) recalled the main operations under which collusions are closed, (3) proved the main results about collusions concerning partitioning (namely that collusions characterize partitioning and that the partitions they generate can be rephrased in terms of coalitions), and (4) reformulated the principle of "definitions by abstraction" so as to cope with collusions, I will present the recent results obtained jointly with Thomas Seiller.
(5) The first is a re-characterization of collusions in terms of bijections between partitions. (6) The second investigates the type-based approach to collusions through the orthogonality discipline induced by collusions and characterizes the structure of the set of types induced by collusions. (7) The third, time allowing, constructs a standard way to extend a collusion (a set of types, i.e. of coalitions) into another one (an extension founded on the idea of natural allies), whenever a compatibility condition is satisfied.
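The classical mechanism the talk starts from - introducing an operator f with f(x) = f(y) iff x R y - can be sketched concretely; the code below (names ours) takes f to map each element to its R-class, i.e. to a cell of the induced partition:

```python
# "Definition by abstraction" for an equivalence relation R: introduce
# a fresh operator f with f(x) = f(y) iff x R y.  Here f sends each
# element to its R-class (a frozenset), i.e. to a cell of the partition
# that R induces on the domain.
def abstraction(domain, R):
    return {x: frozenset(y for y in domain if R(x, y)) for x in domain}

domain = range(9)
R = lambda x, y: x % 3 == y % 3   # congruence mod 3: an equivalence relation
f = abstraction(domain, R)

# f identifies exactly the R-related elements, and its image is the
# partition of the domain into three residue classes.
all((f[x] == f[y]) == R(x, y) for x in domain for y in domain)
```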
- 16.00-16.45: Naohiko Hoshino.
Partial traces and additive categories.
- 16.45-18.00: Free Discussion
Wednesday, October 10th, room 3052
- 9.30-10.15: First Session
- Kazushige Terui.
Proof nets and boolean circuits revisited.
MLL is poor, but not starvingly so. In fact, (T. 2004) argues that MLL proof nets, in contrast to boolean circuits, have (finite) "higher order" facilities so that certain nontrivial boolean gates, such as majority and mod-p, can be smoothly encoded by *constant-depth* proof nets. This observation has led to an ICC result: the constant-depth families of MLL proof nets precisely capture the class L/poly (Mazza and T. 2015).
In this talk, we address the following problems.
1. The cost of expressivity is the use of type substitutions. Let
B:= (X tensor X) -o (X tensor X)
be the type for booleans. When encoding a boolean function by a proof net, one has to substitute an MLL type for X. This is absolutely necessary, but not quite pleasant from an aesthetic point of view. Using second-order quantifiers is not a good idea, since one would then lose the notion of depth. Alternatively, we consider the monoid of type substitutions acting on the syntactic category of MLL types and proof nets, which can be formalised as a graded comonad, leading to a Kleisli-like construction (though not exactly so). This allows us to hide away ugly substitutions while keeping the notion of depth.
2. A simple quantum extension of MLL proof nets was proposed by Dal Lago, Faggian, Hasuo and Yoshimizu (2011, 2014), and has been developed since then. The idea is to introduce a modal box or a type constant to encode a qubit. Quantum computation can then be done on an atomic type wrapped in boxes or on a tensor product of type constants. It works perfectly, but raises a question: what would be a minimal extension of MLL to encode quantum circuits (leaving aside measurement) if we do not want to add a constant but do want to stick to the above style of encoding based on B as the type for booleans (qubits)? We discuss ongoing work in this direction.
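A set-theoretic shadow of the boolean encoding at type B = (X ⊗ X) ⊸ (X ⊗ X) may help fix intuitions: the two booleans act on a pair as the identity and the swap. This only illustrates the style of the encoding, not the proof nets themselves (names ours):

```python
# The MLL boolean type B = (X (x) X) -o (X (x) X) suggests encoding the
# two booleans as the two "linear" maps on pairs: identity and swap.
# A set-theoretic shadow of the proof-net encoding, not the nets.
true  = lambda p: p                    # identity on the pair
false = lambda p: (p[1], p[0])         # swap the pair

# Observing a boolean: apply it to the ordered pair ("T", "F") and
# read off the first component.
def observe(b):
    return b(("T", "F"))[0] == "T"

# Negation is post-composition with a swap.  This is where the type
# substitution discussed above shows up: the X in the argument's type
# must be instantiated for the composite to typecheck.
def NOT(b):
    return lambda p: false(b(p))
```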
- 10.15-10.45: Coffee Break
- 10.45-12.30: Second Session
- 10.45-11.30: Giulio Guerrieri.
Types of Fireballs.
The good properties of Plotkin's call-by-value lambda-calculus crucially rely on the restriction to weak evaluation and closed terms. Open call-by-value is the more general setting where evaluation is weak but terms may be open. Such an extension is delicate and the literature contains a number of proposals. Recently, we provided operational and implementative studies of these proposals, showing that they are equivalent with respect to termination, and also at the level of time cost models. This paper explores the denotational semantics of open call-by-value, adapting de Carvalho's analysis of call-by-name via multi types (aka non-idempotent intersection types). Our type system characterises normalisation and thus provides an adequate relational semantics. Moreover, type derivations carry quantitative information about the cost of evaluation: their size bounds the number of evaluation steps and the size of the normal form, and we also characterise derivations giving exact bounds. The study crucially relies on a new, refined presentation of the fireball calculus, the simplest proposal for open call-by-value, that is more apt to denotational investigations.
[slides]
- 11.45-12.30: Lutz Strassburger.
Proof nets for first-order additive linear logic.
We extend proof nets for additive linear logic with first-order quantification. We present two versions of our proof nets. One, witness nets, retains explicit witnessing information for existential quantification. For the other, unification nets, this information is absent but can be reconstructed through unification. Unification nets embody a central contribution of the paper: first-order witness information can be left implicit, and reconstructed as needed.
Witness nets are canonical for first-order additive sequent calculus. Unification nets in addition factor out any inessential choice for existential witnesses. Both notions of proof net are defined through coalescence (the additive version of multiplicative contractibility), and for witness nets a geometric correctness criterion is provided. Both feature a global composition operation.
(Joint work with Willem Heijltjes and Dominic Hughes)
- 12.30-15.00: Lunch (3rd Floor) and Free Discussion
- 15.00-16.45: Third Session
- 15.00-15.45: Claudia Faggian.
Probabilistic Lambda Calculus: What about Confluence?
Functional programs are inherently parallel, because sub-expressions can be evaluated in parallel; still, we can reason about a program using a deterministic sequential model, because the result of the computation is independent of the evaluation order. This key property - which makes functional languages well suited for parallel programming - ultimately relies on the confluence of the lambda calculus.
It is well known that confluence fails for the probabilistic lambda calculus. The typical way out is to fix a deterministic reduction strategy, but this is hardly satisfactory. Is there more structure to uncover between the two extremes of full reduction - which behaves wildly - and fixing a deterministic strategy? We present ongoing work motivated by this question, but also by the desire to better understand the issue behind non-confluence in the probabilistic setting, and by a more general observation: very little is known about the operational semantics of the probabilistic lambda calculus.
(Joint work with Simona Ronchi della Rocca)
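The failure of confluence mentioned above can be seen on the textbook example of a duplicated coin toss: sampling before or after duplication yields different distributions for the "same" term. A sketch (names ours):

```python
import random

# Call-by-value first evaluates the argument: the coin is tossed once
# and its *value* is duplicated, so comparing the two copies always
# yields True.
def cbv_eq(rng):
    v = rng.random() < 0.5
    return v == v

# Call-by-name substitutes the unevaluated expression: the coin is
# tossed once per occurrence of the variable, so the comparison
# succeeds only with probability 1/2.
def cbn_eq(rng):
    return (rng.random() < 0.5) == (rng.random() < 0.5)

rng = random.Random(0)
freq = sum(cbn_eq(rng) for _ in range(10_000)) / 10_000
# freq is close to 1/2, while cbv_eq always yields True: the two
# reduction orders evaluate the same term to different distributions.
```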
- 16.00-16.45: Ugo Dal Lago.
On Type Systems for Probabilistic Termination.
Probabilistic effects are algebraic and thus fit into the well-established Plotkin-Power abstract framework, through which operational semantics and algebraic theories can easily be derived. On the other hand, probabilistic termination has a nature which is definitely different from usual, "deterministic", termination. This can be seen, as an example, when looking at the problem through the lens of recursion theory. We will describe the problem, and then sketch the main ingredients of some recently introduced type systems for the probabilistic lambda calculus which guarantee probabilistic termination of typable terms. The first is a generalization of sized types, while the second is an intersection type system designed to achieve completeness.
[slides]
- 16.45-18.00: Free Discussion
Thursday, October 11th, room 3058
- 9.30-10.15: First Session
- Anupam Das.
A monotone version of Cook's correspondence, and beyond.
The notion of monotonicity has popped up independently in two of the three prongs of the proof theoretic approach to complexity theory: propositional proofs and computational complexity. In both areas the fundamental results challenge the usual orthodoxy: on one hand Razborov's famous '85 result gives us exponential lower bounds on the size of negation-free circuits; on the other hand Jerabek '12 showed, building on previous work by Atserias, Galesi & Pudlak '02, that monotone propositional proofs with extension (i.e. over circuits) polynomially simulate non-monotone propositional proofs with extension over monotone implications.
In this talk I will review recent work where we proposed a recursion-theoretic characterisation of positive polynomial-time functions, building on characterisations of uniform positive polynomial-time (predicates) by Lautemann, Schwentick & Stewart '96. Furthermore, we give a monotone version of Cook's PV based on this function algebra. The resulting equational theory bridges the gap between the previous monotone versions of polynomial-time studied in the settings of computational complexity and proof complexity.
(Joint work with Isabel Oitavem)
- 10.15-10.45: Coffee Break
- 10.45-12.30: Second Session
- 10.45-11.30: Nguyen Le Thanh Dung.
From finite semantics to regular languages (and beyond) in second-order linear logic.
In the 90's, Hillebrand and Kanellakis showed that (1) simply typed lambda-terms sending Church-encoded strings to booleans decide regular languages, and proposed (2) a representation of inputs as finite relational structures overcoming this expressivity barrier. By adapting their semantic evaluation argument for (1), we get an analogous result in second-order Elementary Linear Logic (ELL), highlighting the crucial role of type fixpoints in Baillot's characterization of Ptime in ELL. This relies on the construction of a finite semantics for second-order linear logic without exponentials. Furthermore, taking inspiration from (2), we conjecture a characterization of Logspace within ELL, for which we can prove extensional completeness using descriptive complexity.
[slides]
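Observation (1) can be shadowed concretely: a Church-encoded string is a fold, and instantiating the fold with the transition functions of a finite automaton decides a regular language. A sketch for the language of words with an even number of a's (names ours):

```python
from functools import reduce

# Church-encode a string over {a, b} as a fold: the string becomes the
# composite of one transition function per letter, applied to a seed.
def church(s):
    return lambda fa, fb, x0: reduce(
        lambda acc, c: fa(acc) if c == 'a' else fb(acc), s, x0)

# The regular language "even number of a's", decided by instantiating
# the fold with an automaton's transitions on the state space {True, False}.
def even_as(w):
    return church(w)(lambda q: not q,   # reading 'a' flips the parity
                     lambda q: q,       # reading 'b' keeps it
                     True)              # initial state: zero a's (even)
```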
- 11.45-12.30: Romain Demangeon.
Causal Computational Complexity of Distributed Processes.
We study the complexity of pi-calculus processes with respect to the quantity of transitions caused by an incoming message. First we propose a typing system integrating Bellantoni and Cook's characterisation of polynomially bounded recursive functions into Deng and Sangiorgi's typing system for termination. We then define the computational complexity of distributed messages based on Degano and Priami's causal semantics, which identifies the dependency between interleaved transitions. Next we apply a syntactic flow analysis to typable processes to ensure the computational bound of distributed messages. We prove that our analysis is decidable for a given process; sound, in the sense that it guarantees that the total number of messages causally dependent on an input request received from the outside is bounded by a polynomial in the content of this request; and complete, in the sense that each polynomially bounded recursive function can be computed by a typable process.
[slides]
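The Bellantoni-Cook discipline that the typing system builds on separates normal arguments (which may drive recursion) from safe ones (which may receive recursive results). A minimal sketch of the discipline itself, not of the pi-calculus typing system (names and examples ours):

```python
# Bellantoni-Cook style definitions split arguments into *normal* ones
# (left of ';' in the usual notation, allowed to drive recursion) and
# *safe* ones (where recursive results may land).
def add(x, y):        # add(x; y): recurse on normal x; y stays safe
    return y if x == 0 else 1 + add(x - 1, y)   # successor of a safe value

def mul(x, y):        # mul(x, y; ): recurse on normal x; the recursive
    return 0 if x == 0 else add(y, mul(x - 1, y))  # result is used safely

# The discipline forbids recursing on a previously computed recursive
# result, which is what rules out exponential growth such as
# f(x) = mul(f(x - 1), f(x - 1)).
```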
- 12.30-14.00: Lunch (3rd Floor) and Free Discussion
- 14.00-15.45: Third Session
- 14.00-14.45: Damiano Mazza.
A Functorial Perspective on Constraint Satisfaction Problems.
We present some ongoing work on understanding constraint satisfaction problems as certain kinds of fibrations on a strict monoidal category.
- 15.00-15.45: Jean-Yves Moyen.
Implicit Complexity and Formal Proofs.
[slides]
- 15.45-16.45: Coffee Break and Free Discussion
- 16.45-18.00: ELICA Business Meeting
Venue
Internet
- eduroam wifi will be available at the venue.