The last two decades have seen a wave of exciting new developments in the theory of algorithmic randomness and its applications to other areas of mathematics. This volume surveys much of the recent work that has not been included in published volumes until now. It contains a range of articles on algorithmic randomness and its interactions with closely related topics such as computability theory and computational complexity, as well as wider applications in areas of mathematics including analysis, probability, and ergodic theory. In addition to being an indispensable reference for researchers in algorithmic randomness, the unified view of the theory presented here makes this an excellent entry point for graduate students and other newcomers to the field.
Marker (mathematics, statistics, and computer science; U. of Illinois-Chicago), Margit Messmer, and Anand Pillay (both: mathematics, U. of Illinois-Urbana-Champaign) present four lectures for a graduate course.
Classical computable model theory is most naturally concerned with countable domains. There are, however, several methods – some old, some new – that have extended its basic concepts to uncountable structures. Unlike in the classical case, no single dominant approach has emerged, and different methods reveal different aspects of the computable content of uncountable mathematics. This book contains introductions to eight major approaches to computable uncountable mathematics: descriptive set theory; infinite time Turing machines; Blum-Shub-Smale computability; Sigma-definability; computability theory on admissible ordinals; E-recursion theory; local computability; and uncountable reverse mathematics. It provides an authoritative and multifaceted introduction to this exciting new area of research that is still in its early stages, and is ideal both as an introductory text for graduate and advanced undergraduate students and as a source of interesting new approaches for researchers in the area.
Alan Turing was an inspirational figure who is now recognised as a genius of modern mathematics. In addition to leading the Allied forces' code-breaking effort at Bletchley Park in World War II, he proposed the theoretical foundations of modern computing and anticipated developments in areas from information theory to computer chess. His ideas have been extraordinarily influential in modern mathematics and this book traces such developments by bringing together essays by leading experts in logic, artificial intelligence, computability theory and related areas. Together, they give insight into this fascinating man, the development of modern logic, and the history of ideas. The articles within cover a diverse selection of topics, such as the development of formal proof, differing views on the Church–Turing thesis, the development of combinatorial group theory, and Turing's work on randomness which foresaw the ideas of algorithmic randomness that would emerge many years later.
The proceedings of the Los Angeles Caltech-UCLA 'Cabal Seminar' were originally published in the 1970s and 1980s. Ordinal Definability and Recursion Theory is the third in a series of four books collecting the seminal papers from the original volumes together with extensive unpublished material, new papers on related topics and discussion of research developments since the publication of the original volumes. Focusing on the subjects of 'HOD and its Local Versions' (Part V) and 'Recursion Theory' (Part VI), each of the two sections is preceded by an introductory survey putting the papers into present context. These four volumes will be a necessary part of the book collection of every set theorist.
This collection of papers from various areas of mathematical logic showcases the remarkable breadth and richness of the field. Leading authors reveal how contemporary technical results touch upon foundational questions about the nature of mathematics. Highlights of the volume include: a history of Tennenbaum's theorem in arithmetic; a number of papers on Tennenbaum phenomena in weak arithmetics as well as on other aspects of arithmetics, such as interpretability; the transcript of Gödel's previously unpublished 1972–1975 conversations with Sue Toledo, along with an appreciation of the same by Curtis Franks; Hugh Woodin's paper arguing against the generic multiverse view; Anne Troelstra's history of intuitionism through 1991; and Aki Kanamori's history of the Suslin problem in set theory. The book provides a historical and philosophical treatment of particular theorems in arithmetic and set theory, and is ideal for researchers and graduate students in mathematical logic and philosophy of mathematics.
The proceedings of the Los Angeles Caltech-UCLA 'Cabal Seminar' were originally published in the 1970s and 1980s. Wadge Degrees and Projective Ordinals is the second of a series of four books collecting the seminal papers from the original volumes together with extensive unpublished material, new papers on related topics and discussion of research developments since the publication of the original volumes. Focusing on the subjects of 'Wadge Degrees and Pointclasses' (Part III) and 'Projective Ordinals' (Part IV), each of the two sections is preceded by an introductory survey putting the papers into present context. These four volumes will be a necessary part of the book collection of every set theorist.
This book presents and applies a framework for studying the complexity of algorithms. It is aimed at logicians, computer scientists, mathematicians and philosophers interested in the theory of computation and its foundations, and it is written at a level suitable for non-specialists. Part I provides an accessible introduction to abstract recursion theory and its connection with computability and complexity. This part is suitable for use as a textbook for an advanced undergraduate or graduate course: all the necessary elementary facts from logic, recursion theory, arithmetic and algebra are included. Part II develops and applies an extension of the homomorphism method due jointly to the author and Lou van den Dries for deriving lower complexity bounds for problems in number theory and algebra which (provably or plausibly) restrict all elementary algorithms from specified primitives. The book includes over 250 problems, from simple checks of the reader's understanding to current open problems.
Kurt Gödel (1906–1978) did groundbreaking work that transformed logic and other important aspects of our understanding of mathematics, especially his proof of the incompleteness of formalized arithmetic. This book on different aspects of his work and on subjects in which his ideas have contemporary resonance includes papers from a May 2006 symposium celebrating Gödel's centennial as well as papers from a 2004 symposium. Proof theory, set theory, philosophy of mathematics, and the editing of Gödel's writings are among the topics covered. Several chapters discuss his intellectual development and his relation to predecessors and contemporaries such as Hilbert, Carnap, and Herbrand. Others consider his views on justification in set theory in light of more recent work and contemporary echoes of his incompleteness theorems and the concept of constructible sets.
Infinitary logic, the logic of languages with infinitely long conjunctions, plays an important role in model theory, recursion theory and descriptive set theory. This book is the first modern introduction to the subject in forty years, and will bring students and researchers in all areas of mathematical logic up to the threshold of modern research. The classical topics of back-and-forth systems, model existence techniques, indiscernibles and end extensions are covered before more modern topics are surveyed. Zilber's categoricity theorem for quasiminimal excellent classes is proved and an application is given to covers of multiplicative groups. Infinitary methods are also used to study uncountable models of counterexamples to Vaught's conjecture, and effective aspects of infinitary model theory are reviewed, including an introduction to Montalbán's recent work on spectra of Vaught counterexamples. Self-contained introductions to effective descriptive set theory and hyperarithmetic theory are also included.
Arising from a special session held at the 2010 North American Annual Meeting of the Association for Symbolic Logic, this volume is an international cross-disciplinary collaboration with contributions from leading experts exploring connections across their respective fields. Themes range from philosophical examination of the foundations of physics and quantum logic, to exploitations of the methods and structures of operator theory, category theory, and knot theory in an effort to gain insight into the fundamental questions in quantum theory and logic. The book will appeal to researchers and students working in related fields, including logicians, mathematicians, computer scientists, and physicists. A brief introduction provides essential background on quantum mechanics and category theory, which, together with a thematic selection of articles, may also serve as the basic material for a graduate course or seminar.
This book is an up-to-date introduction to simple theories and hyperimaginaries, with special attention to Lascar strong types and the elimination of hyperimaginaries problem. Assuming only knowledge of general model theory, it presents the foundations of forking, stability and simplicity in full detail. The treatment of the topics is as general as possible, working with stable formulas and types and assuming stability or simplicity of the theory only when necessary. The author offers an introduction to independence relations as well as a full account of canonical bases of types in stable and simple theories. In the last chapters the notions of internality and analyzability are discussed and used to provide a self-contained proof of elimination of hyperimaginaries in supersimple theories.
Descriptive complexity theory establishes a connection between the computational complexity of algorithmic problems (the computational resources required to solve the problems) and their descriptive complexity (the language resources required to describe the problems). This groundbreaking book approaches descriptive complexity from the angle of modern structural graph theory, specifically graph minor theory. It develops a 'definable structure theory' concerned with the logical definability of graph theoretic concepts such as tree decompositions and embeddings. The first part starts with an introduction to the background from logic, complexity, and graph theory, and develops the theory up to first applications in descriptive complexity theory and graph isomorphism testing. It may serve as the basis for a graduate-level course. The second part is more advanced and mainly devoted to the proof of a single, previously unpublished theorem: properties of graphs with excluded minors are decidable in polynomial time if, and only if, they are definable in fixed-point logic with counting.
The proceedings of the Los Angeles Caltech-UCLA 'Cabal Seminar' were originally published in the 1970s and 1980s. Large Cardinals, Determinacy and Other Topics is the final volume in a series of four books collecting the seminal papers from the original volumes together with extensive unpublished material, new papers on related topics and discussion of research developments since the publication of the original volumes. This final volume contains Parts VII and VIII of the series. Part VII focuses on 'Extensions of AD, models with choice', while Part VIII ('Other topics') collects material important to the Cabal that does not fit neatly into one of its main themes. These four volumes will be a necessary part of the book collection of every set theorist.
The study of NIP theories has received much attention from model theorists in the last decade, fuelled by applications to o-minimal structures and valued fields. This book, the first to be written on NIP theories, is an introduction to the subject that will appeal to anyone interested in model theory: graduate students and researchers in the field, as well as those in nearby areas such as combinatorics and algebraic geometry. Without dwelling on any one particular topic, it covers all of the basic notions and gives the reader the tools needed to pursue research in this area. An effort has been made in each chapter to give a concise and elegant path to the main results and to stress the most useful ideas. Particular emphasis is put on honest definitions, handling of indiscernible sequences and measures. The relevant material from other fields of mathematics is made accessible to the logician.