12th Panhellenic Logic Symposium

June 26-30, 2019
Anogeia, Crete, Greece

Plenary Talks


Joan Bagaria: On Woodin’s HOD Conjecture, large cardinals beyond Choice, and class forcing

Woodin’s HOD Conjecture asserts that the theory "ZFC plus there exists an extendible cardinal" proves that the universe V of sets is very close to HOD (the universe of Hereditarily Ordinal-Definable sets). While Woodin’s work on the construction of his Ultimate-L model has provided ample evidence in favour of the HOD Conjecture, there remains a faint possibility that some very large cardinals inconsistent with ZFC, such as Berkeley cardinals, turn out to be consistent with ZF, which would refute the HOD Conjecture. Moreover, some recent results on the preservation of extendible cardinals under very general class forcing iterations may yield further evidence for a "V far from HOD" scenario.
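
As it is usually stated in the literature (and not specific to this talk), being "very close to HOD" is made precise by the HOD Hypothesis: there is a proper class of regular cardinals that are not ω-strongly measurable in HOD. The HOD Conjecture is then the assertion that the theory ZFC + "there exists an extendible cardinal" proves the HOD Hypothesis.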

Fernando Ferreira: A new take on proof mining

Proof mining is a program whose aim is to formulate quantitative versions of theorems of mathematics and to extract explicit bounding information from the proofs of those theorems. In the first part of the talk, we emphasize the first aspect of proof mining. We give two examples, one in analysis, the other in algebra, where the so-called bounded functional interpretation helps in formulating appropriate quantitative versions. In the second part of the talk, we argue that some principles of the bounded functional interpretation - to wit, the very same principles that guide the formulation of the quantitative versions - help in explaining why certain minings are possible. These principles generalize weak König’s lemma and yield conservation results (one of which is Harvey Friedman’s well-known theorem). A curious feature is that these principles may be false (in contrast to weak König’s lemma) but, by conservativity, they have only true quantitative consequences. We illustrate this latter feature with a discussion of a famous fixed point theorem of Felix Browder.

Simona Ronchi Della Rocca: Logics, Programming Languages and Implicit Computational Complexity

Implicit Computational Complexity (ICC) is a research area whose aim is twofold:
1) to revise classical complexity theory by studying complexity classes without reference to explicit machine models or external measures, instead considering restrictions on programming languages and calculi;
2) to design programming languages with bounded computational complexity.
I will present some results along both of these lines, obtained using tools inspired by (variants of) Linear Logic and exploiting the so-called Curry-Howard isomorphism, which connects logical formulas with types and proofs with programs. In particular, I will briefly illustrate abstract characterizations of the complexity classes PTIME, PSPACE and NP obtained through this approach.
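
To illustrate the Curry-Howard reading mentioned above (a minimal sketch with made-up names, not an example from the talk, and without the linear-logic restrictions that yield complexity bounds): a type is read as a propositional formula, and a program of that type as a proof of it.

    -- Curry-Howard: the type a -> (b -> a) is the tautology A → (B → A),
    -- and the program inhabiting it is a proof of that tautology.
    proofK :: a -> (b -> a)
    proofK x _ = x            -- given evidence for A, ignore the evidence for B

    -- Likewise for the Hilbert-style axiom (A → B → C) → (A → B) → (A → C).
    proofS :: (a -> b -> c) -> (a -> b) -> (a -> c)
    proofS f g x = f x (g x)

    main :: IO ()
    main = print (proofK 42 "unused")   -- prints 42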

Mehrnoosh Sadrzadeh: Principles of Natural Language, Logic, and Tensor Semantics

The first formal approaches to natural language go back to the division calculus of Ajdukiewicz in the 1930s, where structures similar to groups were used to provide a functional interpretation for grammatical types and their composition. In the 1950s, these systems were refined with two, rather than one, division operators, and Lambek developed a residuated monoid semantics and a cut-free sequent calculus for them. I will show how one can develop a vector space semantics for residuated monoids and how this solves an open problem in the field of “distributional semantics” in Statistical Natural Language Processing. This semantics provides higher-order tensor representations for sentences by composing the vectors/tensors of the words therein, themselves populated by the statistics of occurrence in large corpora of data. I will present experimental results showing that these models beat non-compositional baselines in tasks such as disambiguation, similarity and entailment. I will also go through recent work where adding a copying and a moving operation to residuated monoids enables us to lift the models from the sentence level to the level of discourse and to reason about phenomena such as ellipsis and anaphora. These models enjoy a categorical foundation in terms of functors between compact closed categories and Frobenius algebras and bialgebras over them.
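
As a toy illustration of the compositional idea (a sketch with made-up names and numbers, not a model from the talk): a noun is represented by a vector of co-occurrence statistics, a one-argument modifier such as an adjective by a matrix, and composition is tensor contraction, here plain matrix-vector multiplication.

    -- Toy compositional distributional semantics: "big dog" as a matrix-vector product.
    type Vector = [Double]
    type Matrix = [Vector]                -- a matrix as a list of rows

    dog :: Vector                         -- hypothetical 3-dimensional vector for "dog"
    dog = [0.8, 0.1, 0.3]

    big :: Matrix                         -- hypothetical matrix for the adjective "big"
    big = [ [1.2, 0.0, 0.1]
          , [0.0, 0.9, 0.0]
          , [0.2, 0.0, 1.1] ]

    apply :: Matrix -> Vector -> Vector   -- contraction: compose modifier with noun
    apply m v = [ sum (zipWith (*) row v) | row <- m ]

    main :: IO ()
    main = print (apply big dog)          -- the composed vector for "big dog"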

Theodore Slaman: Diophantine Approximation and Recursion Theory

We will discuss aspects of Diophantine Approximation which are motivated and informed by Recursion Theoretic considerations.

Patrick Speissegger: Limit cycles of planar vector fields, Hilbert’s 16th problem and o-minimality

Recent work links certain aspects of the second part of Hilbert’s 16th problem (H16) to the theory of o-minimality. One of these aspects is the generation and destruction of limit cycles in families of planar vector fields, commonly referred to as “bifurcations”. I will outline the significance of bifurcations for H16 and explain how logic, in particular o-minimality, can be used to understand them well enough to be able to count limit cycles.

Yannis Stephanou: A Theory of Truth with a Determinacy Operator

Any formal theory of truth has to deal with the semantic paradoxes, of which the simplest is the liar: the paradox of the sentence, (L), that describes itself as not being true. The talk will first adumbrate a theory of truth that tackles the paradoxes by relying on an appropriately constructed non-classical logic. The theory incorporates neither "(L) is true" nor "(L) is not true". It includes the biconditional "(L) is true iff (L) is not true", as well as all other instances of the T-schema that are formulated in the language of the theory (the T-schema being the schema "S is true iff p" where "p" is to be replaced with a sentence and "S" is to be replaced with a name of that sentence). The theory, however, does not include the contradiction "(L) is both true and not true". In fact, it includes "It is not the case that (L) is both true and not true", though not the disjunction "(L) is either true or not true". Similarly, the theory sanctions the statements "It is not the case that (L) is true and false" and "It is not the case that (L) is not true and not false", but not "(L) is true or false". It can be proved that the theory has a model, in a non-classical sense of "model", and so contains no contradiction. This part of the talk will survey work that has been published.
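
In symbols (a standard rendering, not specific to this talk), writing Tr for the truth predicate and ⌜L⌝ for a name of the liar sentence, (L) is a sentence equivalent to ¬Tr(⌜L⌝), so its T-schema instance becomes Tr(⌜L⌝) ↔ ¬Tr(⌜L⌝), the biconditional mentioned above. In classical logic this biconditional entails the contradiction Tr(⌜L⌝) ∧ ¬Tr(⌜L⌝); the role of the non-classical logic is to accept the biconditional while blocking that inference.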

We will then turn to an operator Δ such that Δp means "It is (objectively) determinate whether p", "Reality (irrespective of what we can know about it) contains an answer to the question whether p". The concept expressed by Δ has been used by philosophers in various contexts. In particular, one may well have the intuition that it is not determinate whether (L), or any other paradoxical sentence, is true. Also, Δ allows us to formulate some additional paradoxes. The talk will sketch a logic and a theory which add that operator to the above-mentioned non-classical logic and theory of truth. The logic of Δ will give us a philosophical reason in favour of an aspect of the whole approach which might seem far-fetched, namely, its abandonment of the rule of conditional proof. It will also allow us to define another operator, ⊡, which means "It is determinately the case that ..." and is reminiscent of the necessity operators of modal logic. The truth-theory with Δ comes in more than one version. One version incorporates the statement "It is not determinate whether (L) is true". Another version does not. The latter may be preferable, since by claiming "It is not determinate whether (L) is true", we imply that reality contains an answer to the question whether (it is determinate whether (L) is true). But reality may contain no such answer. All versions include all instances of the T-schema that are formulated in their language, and it can be proved that each version has a model, again in a non-classical sense of "model".

Boris Zilber: Anabelian geometry in model theory setting.

The talk is about the interaction of model theory and algebraic/arithmetic geometry. It is also about the comparative power of model theory versus category theory.

I will present a model-theoretic formalism for treating analytic and étale covers of algebraic varieties. This allows a reformulation of Grothendieck's anabelian geometry. It also allows one to consider questions of categoricity of the respective non-elementary theories (or rather of Shelah's abstract elementary classes).

A series of results obtained by various authors in 2002-2017 for semi-abelian varieties demonstrated that categoricity is equivalent to the classification of the action of Galois groups on the torsion subgroups, together with the relevant Kummer theory. In anabelian cases the assumption of categoricity leads directly to conjectures about the action of the Galois group on the profinite fundamental group, first raised in Grothendieck's “Esquisse d'un programme”. Our results shed new light on these issues.

Tutorial

Mirna Džamonja: Forcing

It has been 55 or so years since the technique of forcing was discovered. The three hours of the tutorial aim to cover forcing through a choice of three main periods of its development and use. The first hour will go through the basics, which will allow those who are not familiar with the technique to follow the rest of the tutorial. The second hour will show some of the classic uses of forcing and will include a presentation of several forcing axioms. The final hour will be devoted to contemporary developments.

Computer Science Special Session

Sylvain Schmitz: Well-quasi-orders in Logic

Well-quasi-orders are a versatile tool for proving expressivity and decidability results. The talk will provide a glimpse of their applications in logic, through examples from proof theory, finite model theory, and verification.
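
For readers who have not met the notion, the standard definition is as follows: a quasi-order (Q, ≤) is a well-quasi-order if every infinite sequence q₀, q₁, q₂, … of elements of Q contains indices i < j with qᵢ ≤ qⱼ; equivalently, Q has no infinite antichain and no infinite strictly descending sequence. Classical examples used in decidability arguments include Dickson's lemma (ℕᵏ under the componentwise order is a well-quasi-order) and Higman's lemma (finite words over a well-quasi-ordered alphabet, ordered by subword embedding, form a well-quasi-order).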

Ana Sokolova: Semantics for Probability and Nondeterminism via Coalgebra

In this talk I will introduce the very basics of coalgebra, and show you how different semantics (bisimilarity, convex bisimilarity, distribution bisimilarity, trace semantics) for systems with probability and nondeterminism elegantly fit in the theory of coalgebra.
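
As a small self-contained illustration (not taken from the talk, with made-up names), a coalgebra for a functor F is just a state space X together with a map X → F X; in Haskell, streams over a arise as coalgebras of the functor F X = (a, X).

    -- A coalgebra for a functor f is a state space x with a map x -> f x.
    -- Illustrative sketch: streams over a as coalgebras of  F X = (a, X).
    type Coalg a x = x -> (a, x)

    -- the natural numbers as states: observe the current number, step to the next
    countingCoalg :: Coalg Integer Integer
    countingCoalg n = (n, n + 1)

    -- unfold a coalgebra for k steps, collecting the observations (a finite trace)
    trace :: Int -> Coalg a x -> x -> [a]
    trace 0 _ _ = []
    trace k c x = let (a, x') = c x in a : trace (k - 1) c x'

    main :: IO ()
    main = print (trace 5 countingCoalg 0)   -- [0,1,2,3,4]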

Niki Vazou: Liquid Haskell: Theorem Proving for All

Formal verification has been gaining the attention and resources of both the academic and the industrial world, since it prevents critical software bugs that cost money, energy, time, and even lives. Yet software development and formal verification remain decoupled, requiring verification experts to prove properties of a template implementation, rather than the actual one, ported into verification-specific languages. Niki's goal is to bridge formal verification and software development for the programming language Haskell. Haskell is unique in that it is a general-purpose functional language used for industrial development while simultaneously standing at the leading edge of research and teaching, welcoming new, experimental, yet useful features. In this talk, Niki presents Liquid Haskell, a refinement type checker in which formal specifications are expressed as a combination of Haskell’s types and expressions and are automatically checked against real Haskell code. This natural integration of specifications into the language, combined with automatic checking, has established Liquid Haskell as a usable verifier, enthusiastically accepted by both industrial and academic Haskell users. Recently, Niki turned Liquid Haskell into a theorem prover, in which arbitrary theorems about Haskell functions can be proved within the language. As a consequence, Liquid Haskell can be used by all Haskell programmers to prove theorems about Haskell functions.
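
To give a flavour of the approach, here is a textbook-style Liquid Haskell fragment (an illustrative sketch, not necessarily an example from the talk): the refinement, written inside a special comment, is checked against the ordinary Haskell definition, ruling out division by zero at verification time.

    -- A typical Liquid Haskell refinement: the divisor is required to be non-zero.
    {-@ safeDiv :: Int -> {d:Int | d /= 0} -> Int @-}
    safeDiv :: Int -> Int -> Int
    safeDiv n d = n `div` d

    ok :: Int
    ok = safeDiv 10 2        -- accepted: the divisor is provably non-zero

    -- A call such as  safeDiv 10 0  would be rejected by the refinement type checker.
    main :: IO ()
    main = print ok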

Model Theory Special Session

Assaf Hasson: On Zilber's restricted trichotomy conjecture

Uncountably categorical structures are controlled by definable geometric structures known as strongly minimal sets. In the late 1970s Zilber conjectured that the geometries of strongly minimal sets are either trivial, locally modular (i.e., of linear type), or the geometries associated with algebraic independence in algebraically closed fields. Hrushovski refuted Zilber's conjecture in essentially all its possible variants. There are quite a few examples, however, where Zilber's conjecture can be proved when restricted to strongly minimal sets definable in a structure supporting a well-behaved (in a non-technical sense) topology. In the talk we will explain Zilber's conjecture, discuss some of the contexts where it is hoped it can be proved, and, if time allows, survey some partial results.

Gareth Jones: Some model theory of elliptic functions

Elliptic functions occur naturally in several different parts of mathematics. I will give a survey of some results on the model theory of these functions, and discuss some open questions and applications.

Tobias Kaiser: Integration in non-archimedean subanalytic geometry

In real analytic geometry one studies semianalytic and subanalytic sets. Globally subanalytic sets and functions exhibit particularly tame geometric behaviour. We establish a Lebesgue measure and integration theory in non-archimedean globally subanalytic geometry. To be more precise, we work in a model of the theory of the real field with restricted analytic functions whose value group has finite archimedean rank. An example is given by the field of Puiseux series over the reals. We show how one can extend the restricted logarithm to a global logarithm with values in the polynomial ring over the model whose dimension is the archimedean rank. We illustrate how one can embed such a logarithm into a model of the real field with restricted analytic functions and exponentiation. This allows us, using model-theoretic arguments, to establish a full Lebesgue measure and integration theory with values in that polynomial ring.
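
Concretely (a standard description, not specific to this talk), the field of Puiseux series over the reals is the union ⋃ₙ ℝ((t^(1/n))) of the formal Laurent series fields in the fractional powers t^(1/n); it is a real closed, non-archimedean ordered field whose value group is ℚ, which is archimedean and hence of archimedean rank one.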

Philosophy and Set Theory Special Session

Carolin Antos: Considering set-theoretic practice

The philosophy of mathematical practice is a recent development in the wider field of the philosophy of mathematics. At the outset it was a departure from classical, mostly foundational questions in the philosophy of mathematics, such as the ontological status of mathematical objects or how we can gain knowledge about them. Instead it focused on the real day-to-day practice of mathematicians and the work they are doing. One obvious way to bring these two strands together is to look at the practice of mathematicians working on the foundations of mathematics, for example set theory. Independently of this development, set-theoretic practice is a frequently used argumentative device in current discussions in the philosophy of set theory, such as the universe/multiverse debate.

In this talk we present work on analyzing set-theoretic practice as used in these discussions with quantitative methods developed in the setting of the philosophy of mathematical practice. In particular we want to examine J. Hamkins' claim that modern set-theoretic practice supports a pluralistic view of the foundations of set theory.

Dima Sinapova: The tree property and its strengthenings

Two central themes in logic are how much the universe of sets resembles Gödel's constructible universe L versus what is possible from forcing and large cardinals. Both are addressed by using infinitary combinatorics to investigate how much compactness can be obtained in the universe. Compactness is the phenomenon whereby a property that holds for every smaller substructure of some object also holds for the object itself. This is usually a consequence of large cardinals, and tends to fail in L.

A key instance of compactness is the tree property, which states that every tree of height κ with levels of size less than κ has a cofinal branch. Informally, this principle is a generalization of König's infinity lemma to uncountable cardinals. It turns out that the tree property and certain strengthenings of it capture the combinatorial essence of large cardinals. An old project in set theory is to force the tree property and its strengthenings at every regular cardinal greater than ℵ1. I will go over the background and then discuss some recent results giving the state of the art of this project.
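
In standard notation (not specific to this talk), the tree property at κ, TP(κ), asserts that every tree of height κ all of whose levels have size less than κ has a branch of length κ. König's infinity lemma is exactly TP(ℵ₀); Aronszajn trees show that TP(ℵ₁) fails in ZFC; and, classically, an inaccessible cardinal κ is weakly compact if and only if TP(κ) holds, which is the sense in which the tree property captures the combinatorial content of a large cardinal.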