Plenary Talks

Andrew Brooke-Taylor, University of Leeds, UK
Title: Categorifying Borel reducibility
Abstract: The framework of Borel reducibility has had great success in showing that various classification programmes cannot be completed, in areas such as ergodic theory and C*-algebras. However, this framework ignores a crucial feature of many classification programmes: namely, that they are expected to be functorial. I will talk about joint work in progress with Filippo Calderoni, introducing a categorified version of Borel reducibility, and noting some differences in consequences from the original framework.

Vassileios Koutavas, Trinity College Dublin, Ireland
Title: Combining Game Semantics and Bisimulation Techniques for the Verification of Program Equivalence
Abstract: Equivalence is a key concept in the area of programming language semantics, as it can concisely express concepts such as modularity and abstraction and expose all the intentional and unintentional ways that different parts of a program can affect each other. There is a long history of theoretical techniques for proving equivalence in programming language semantics and related areas. More recently, equivalence has found applications in the verification of cryptographic protocols, compiler correctness and regression verification. In this talk we will overview the history of equivalence in programming language semantics, primarily focusing on the techniques of game semantics and bisimulation. We will show a new approach which combines these two theoretical techniques and leads to the development of a practical verification tool for equivalence in a programming language with functions and local state. This is a challenging setting, first studied in the semantics of ALGOL, which is quite common in modern programming languages. In this setting, verifying the equivalence of even simple program expressions requires the exploration of infinite behaviours. Being able to reason within a verification tool about many cases of infinite behaviour, while still avoiding reporting any false positives or false negatives, has been the main success of this approach.
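The flavour of equivalence at issue can be seen in a small sketch (a hypothetical illustration, not an example from the talk): two expressions that differ in their use of local state, yet no calling context can tell them apart.

```python
def make_counter_pred():
    """A predicate backed by local state: a counter incremented on each call."""
    state = {"n": 0}

    def pred():
        state["n"] += 1        # the hidden state changes on every call...
        return state["n"] > 0  # ...but the observable result is always True

    return pred


def make_const_pred():
    """A stateless predicate with the same observable behaviour."""
    def pred():
        return True
    return pred
```

Since `state` is invisible outside `make_counter_pred`, every context observes identical behaviour from the two predicates; proving such equivalences in general requires reasoning about all possible calling contexts and their potentially infinite interaction traces, which is what the game-semantic and bisimulation techniques address.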

Thanases Pheidas, University of Crete, Greece
Title: Definability in Number Theory, Algebra and Geometry and Hilbert’s Tenth Problem
Abstract: We survey developments on the questions of decidability (and related questions of definability) of the existential (and first-order) theories of rings of common use, e.g. the rational integers, the rational numbers, and rings of polynomials and rational functions. The starting point of our questions is “Hilbert’s Tenth Problem”.

Linda Brown Westrick, Penn State University, USA
Title: Borel sets and reverse mathematics
The talk was cancelled.

Tutorials

Alex Kavvos, University of Bristol, UK
Title: Type Theory and Homotopy
Abstract: This is a whirlwind tour of Martin-Löf's Type Theory. We explain some of the basic ideas underlying type-theoretic approaches to the foundations of mathematics, introduce the rules of MLTT, and present a few examples. We also provide a sketch of the recently discovered connections to homotopy theory, discuss the relationship with computer-assisted formalised mathematics, and point the audience to recent advances in the field.
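As a taste of the rules of MLTT (in one standard notation, not necessarily the presentation used in the tutorial), the identity type with its introduction and elimination rules can be stated as:

```latex
% Formation and introduction for the identity type
\frac{\Gamma \vdash A \ \mathsf{type} \quad \Gamma \vdash a : A \quad \Gamma \vdash b : A}
     {\Gamma \vdash \mathrm{Id}_A(a,b) \ \mathsf{type}}
\qquad
\frac{\Gamma \vdash a : A}
     {\Gamma \vdash \mathsf{refl}_a : \mathrm{Id}_A(a,a)}

% Elimination (path induction, the J rule): to prove C of an arbitrary
% p : Id_A(x,y), it suffices to prove C in the case of refl
\frac{\Gamma,\, x{:}A,\, y{:}A,\, p{:}\mathrm{Id}_A(x,y) \vdash C \ \mathsf{type}
      \qquad \Gamma,\, x{:}A \vdash c : C[x/y,\ \mathsf{refl}_x/p]}
     {\Gamma,\, x{:}A,\, y{:}A,\, p{:}\mathrm{Id}_A(x,y) \vdash \mathsf{J}(c; x, y, p) : C}
```

Under the homotopy reading sketched in the tutorial, terms of Id_A(a,b) are paths from a to b, and J is path induction.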

Nikos Leonardos, University of Athens, Greece
Title: Bitcoin's backbone algorithm and the consensus problem
Abstract: The purpose of the tutorial is to study Bitcoin's protocol as a distributed algorithm. We will begin by discussing the core of the protocol in a simple model that is static and synchronous. We will prove two of its fundamental properties which we call common prefix and chain quality. Subsequently, we will discuss the Consensus problem. We will show that Bitcoin solves an interesting variant of this fundamental problem. Finally, we discuss attributes of the Bitcoin protocol that allow it to be secure in more realistic models (partially synchronous model, variable difficulty, timestamp rules).
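The proof-of-work at the core of the protocol can be sketched in a few lines (a simplified illustration with a toy difficulty parameter, not Bitcoin's actual block-header format):

```python
import hashlib

def mine(prev_hash: str, data: str, difficulty: int) -> int:
    """Search for a nonce such that SHA-256(prev_hash | data | nonce)
    starts with `difficulty` zero hex digits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{prev_hash}|{data}|{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def verify(prev_hash: str, data: str, nonce: int, difficulty: int) -> bool:
    """Checking a proof of work takes one hash, however long mining took."""
    digest = hashlib.sha256(f"{prev_hash}|{data}|{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

Chaining each block to the previous one via prev_hash is what makes rewriting history computationally expensive, which is the intuition behind the common prefix and chain quality properties discussed in the tutorial.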

Stathis Zachos, National Technical University of Athens, Greece
Title: Introduction to Computational Complexity
Abstract: Complexity classes, Randomness, Interactivity, PCP, Counting. How to deal with NP-completeness, Approximation algorithms, Search Complexity, Parameterized Complexity, Quantum Complexity, Counting revisited (#P, #PE, TotP), Non-uniform Circuit Families, Descriptive Complexity.
Special Sessions

Computer Science:

Bruno Bauwens, HSE University, Russia
Title: The algorithmic information distance
Abstract: The algorithmic information distance is a metric on bitstrings. In the machine-learning literature, it is defined for bitstrings x and y as d(x,y) = max{K(x|y), K(y|x)} + c for some large constant c, where K(x|y) denotes the minimal length of a program that on input y produces x on an optimal prefix-free machine. In some sense, this measure exploits all types of computable structure that can be found between two bitstrings. This optimality property can be formalized in a similar way as Kolmogorov complexity is optimal among measures of information content. The measure has inspired a few interesting machine-learning algorithms, and we will briefly discuss some recent ones. The definition given above satisfies the triangle inequality, but it is rather technical. Therefore, the following characterization is usually mentioned: d(x,y) = min {|p| : U(p,x) = y and U(p,y) = x} + O(log |xy|). In fact, the minimum above was the historically first definition. Note that in this minimum, one considers programs p that are bidirectional: the program should simultaneously map x to y and map y to x. On the other hand, in the definition max{K(x|y), K(y|x)}, the conditional complexities only consider one-directional programs that either map y to x, for K(x|y), or x to y, for K(y|x).
Can we improve the precision from O(log |xy|) to O(1)? It is known that for plain complexity this is indeed true. But the triangle inequality does not hold for plain complexity. For a long time, it was an open question whether the O(1) precision holds for prefix-free machines. Recently, it has been claimed in several places that the answer is yes, but these proofs are wrong. Recently, I gave a counterexample to the equality with O(1) precision. This is remarkable, because many equalities that hold with O(log |xy|) precision for plain complexity can be transformed into equalities that hold with O(1) precision for prefix-free complexity. A famous example is the Kolmogorov-Levin formula K(x,y) = K(x) + K(y | x, K(x)). For the first time, we have found an important (in)equality that holds with better precision for plain complexity. Even more mysteriously, the equality does hold with O(1) precision whenever d(x,y) > 6 log |xy| and x and y are large, and the proof of this result is rather complex.
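Since K is uncomputable, the machine-learning applications mentioned above approximate it with real compressors. The widely used normalized compression distance (a standard practical proxy, not part of the precision results discussed here) can be sketched as:

```python
import zlib

def clen(b: bytes) -> int:
    """Compressed length in bytes, standing in for Kolmogorov complexity K."""
    return len(zlib.compress(b, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a computable approximation of
    max{K(x|y), K(y|x)} / max{K(x), K(y)} using a real compressor."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)
```

Strings sharing structure that the compressor can exploit get a small distance; unrelated strings score close to 1. The quality of the approximation is limited by how much of the "computable structure" between the two strings the chosen compressor actually detects.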

Juan Garay, Texas A&M University, USA
Title: Cryptography in the Blockchain Era
Abstract: The advent of blockchain protocols such as Bitcoin has ignited much excitement, not only for their realization of novel financial instruments, but also for offering alternative solutions to classical problems in fault-tolerant distributed computing and cryptographic protocols. Underlying many such protocols is a primitive known as "proof of work" (PoW), which for over 20 years has been liberally applied in the cryptography and security literature to a variety of settings, including spam mitigation, sybil attacks, and denial-of-service protection; its role in the design of blockchain-based protocols, however, is arguably its most impactful application. At a high level, the way PoWs enable such protocols is by slowing message passing for all parties indiscriminately, thus generating opportunities for honest parties' views to converge, under the assumption that their aggregate computational power exceeds that of the adversary. This talk comprises two parts. First, despite the evolution of our understanding of the PoW primitive, pinning down the exact properties sufficient to prove the security of Bitcoin and related protocols has been elusive. In this talk we identify such properties, and then construct protocols whose security can be reduced to them in the standard model, assuming a common reference string (CRS; cf. a "genesis" block).
All previous protocols rely on the "random oracle" methodology. Second, regarding the realizability of two important problems in the area of cryptographic protocols, Byzantine agreement (BA) and secure multi-party computation (MPC), we show how PoW-based blockchains allow us to realize them even in the presence of a minority of corrupted parties (i.e., t < n/2, where t is the number of corrupted parties and n is their total number), as long as the majority of the computation resources remain in honest hands, while "classically" (i.e., without PoWs), protocols can only tolerate up to t < n/3 corruptions in the absence of a private trusted setup, such as a public-key infrastructure. We resolve this apparent contradiction with a new paradigm we call "Resource-Restricted Cryptography."
The bulk of this talk is based on joint work with Marshall Ball, Aggelos Kiayias, Rafail Ostrovsky, Giorgos Panagiotakos and Vassilis Zikas.

Andrew Lewis-Pye, London School of Economics, UK
Title: Chained Fever: achieving optimal synchronisation for HotStuff
Abstract: This will be a talk of two halves, to accommodate different backgrounds in the audience. First, I’ll give an easy introduction to consensus protocols, focussing on the classical ‘permissioned’ approach rather than Bitcoin, so that there should not be too much overlap with Leonardos’ tutorial. Then I’ll describe an improvement on the state of the art in consensus protocols: a modification of ‘HotStuff’ requiring only O(n) messages per ‘view change’, meaning O(n^2) message complexity to produce a confirmed block of transactions in the worst case. This improves on the previous best of O(n^3) message complexity for optimistically responsive protocols.

Vassilis Zikas, Purdue University, USA & University of Edinburgh, UK
Title: From Blockchain to Global-scale Trustworthy Infrastructure
Abstract: The wide adoption of global computer networks, such as the Internet, creates immense opportunities and challenges the traditional centralized trust model. The idea of giving control of widely used critical infrastructure to its users is becoming ever more popular. Blockchain and Distributed Ledger Technology (DLT) promise to bring these decentralization ideas to reality and to disrupt traditional strongholds of trust in the financial, digital, biomedical, and manufacturing sectors, as well as in governance. In this talk I will discuss blockchain – from its current structure to its vast potential for future applications. The talk will discuss novel design choices that go into blockchain-based DLT, how these choices critically impact the security of the solutions, and how they address implementation and deployment challenges. It will also tease the potential of using reputation-based blockchains to enhance the trustworthiness of decentralized worldwide systems.

Philosophical Logic:

Michael Glanzberg, Rutgers University, USA
Title: Truth and Quantification (joint work with Lorenzo Rossi)
Abstract: Theories of self-applicable truth have been motivated in two main ways. First, if truth conditions provide the meaning of (many kinds of) natural language expressions, then self-applicable truth is instrumental in developing the semantics of natural languages. Second, a self-applicable truth predicate is required to express generalizations that would not otherwise be expressible in natural languages. In order to fulfill its semantic and expressive role, we argue, the truth predicate has to be studied in its interaction with constructs that are actually found in natural languages and extend beyond first-order logic: modals, indicative conditionals, arbitrary quantifiers, and more. Here, we focus on truth and quantification. We develop a Kripkean theory of self-applicable truth for the language of Generalized Quantifier Theory. More precisely, we show how to interpret a self-applicable truth predicate for the full class of type <1,1> (and type <1>) quantifiers to be found in natural languages. As a result, we can model sentences which are not expressible in theories of truth for first-order languages (such as `Most of what Jane's said is true', or `Infinitely many theorems of T are untrue', and several others), thus expanding the scope of existing approaches to truth, both as a semantic and as an expressive device.
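Type <1,1> quantifiers are relations between sets; for instance, the standard conservative truth conditions for 'most' and 'every' (stated here for finite sets, purely as an illustration) can be rendered as:

```python
def most(A: set, B: set) -> bool:
    """Most A are B: |A ∩ B| > |A − B| (finite sets)."""
    return len(A & B) > len(A - B)

def every(A: set, B: set) -> bool:
    """Every A is B: A ⊆ B."""
    return A <= B
```

On this reading, `Most of what Jane's said is true' asks whether most(S, T) holds, where S is the set of sentences Jane uttered and T the extension of the truth predicate; it is precisely the occurrence of T in the second argument that forces the Kripkean construction to handle self-application.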

Volker Halbach, University of Oxford, UK
Title: Axiomatic Theories of Truth: A survey
Abstract: Axiomatic theories of truth are obtained by adding a unary predicate T to the language of arithmetic or another language in which syntax theory is usually developed. Then axioms for T are conjoined with the axioms of arithmetic (or of another syntax theory). The liar and other paradoxes impose limits on which truth axioms can consistently be added. I survey some systems that have been seen as promising and interesting. There are several motivations for investigating axiomatic theories of truth. They are of philosophical interest not only because they give us a better understanding of the concept of truth, but also because, for instance, they can be used to reduce away commitment to second-order objects and to express generalizations over sentences. They are also of purely mathematical interest. First, proof theorists have analyzed their strength and used them as intermediary systems for the analysis of subsystems of second-order arithmetic, but also of set-theoretic systems. Some axiomatic theories of truth also have intriguing properties studied by model theorists.
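To give the flavour of such axiom systems, one representative example (the compositional axioms of the theory usually called CT, stated schematically; s, t range over closed terms, φ, ψ over arithmetical sentences) runs:

```latex
\begin{align*}
 & T(s = t) \leftrightarrow \mathrm{val}(s) = \mathrm{val}(t) \\
 & T(\neg\varphi) \leftrightarrow \neg T(\varphi) \\
 & T(\varphi \wedge \psi) \leftrightarrow T(\varphi) \wedge T(\psi) \\
 & T(\forall x\,\varphi(x)) \leftrightarrow \forall t\; T(\varphi(t))
\end{align*}
```

Restricting these biconditionals to sentences not containing T keeps the theory consistent; allowing φ and ψ to range over the full language with T runs straight into the liar, which is the kind of limit the survey maps out.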

Elia Zardini, University of Lisbon, Portugal & HSE University, Russia
Abstract: In a series of works, P. Cobreros, P. Égré, D. Ripley and R. van Rooij have proposed a non-transitive system (call it ‘K3LP’) as a basis for a solution to the semantic paradoxes. I critically consider that proposal at three levels. At the level of the background logic, I present a conception of classical logic on which K3LP fails to vindicate classical logic not only in terms of structural principles, but also in terms of operational ones. At the level of the theory of truth, I raise a cluster of philosophical difficulties for a K3LP-based system of naive truth, all variously related to the fact that such an extension proves things that would already seem jointly repugnant by themselves, even in the absence of transitivity. At the level of the theory of validity, I consider an extension of the K3LP-based system of naive validity that is supposed to certify that validity in that system does not fall short of naive validity, argue that such an extension is untenable in that its non-triviality depends on the inadmissibility of a certain irresistible instance of transitivity (whence the advertised “final cut”), and conclude on this basis that the K3LP-based system of naive validity cannot coherently be accepted either. At all these levels, a crucial role is played by certain meta-entailments and by the extra strength they afford over the corresponding entailments: on the one hand, such strength derives from considerations that would seem just as compelling in a general non-transitive framework, but, on the other hand, such strength wreaks havoc in the particular setting of K3LP.
Keywords: classical logic; naive truth; naive validity; non-transitive logics.