Cambridge Tracts in Theoretical Computer Science
Series · 32 books · 1987–2013

Books in series

#1

Logic and Computation

Interactive Proof with Cambridge LCF

1987

Logic and Computation is concerned with techniques for formal theorem-proving, with particular reference to Cambridge LCF (Logic for Computable Functions). Cambridge LCF is a computer program for reasoning about computation. It combines methods of mathematical logic with domain theory, the basis of the denotational approach to specifying the meaning of statements in a programming language. This book consists of two parts. Part I outlines the mathematical preliminaries: elementary logic and domain theory. These are explained at an intuitive level, with references to more advanced reading. Part II provides enough detail to serve as a reference manual for Cambridge LCF. It will also be a useful guide for implementors of other programs based on the LCF approach.
#3

Formal Semantics and Pragmatics for Natural Language Querying

1990

This book discusses the connection between two areas of semantics, namely the semantics of databases and the semantics of natural language, and links them via a common view of the semantics of time. It is argued that a coherent theory of the semantics of time is an essential ingredient for the success of efforts to incorporate more 'real world' semantics into database models. This idea is a relatively recent concern of database research, but it is receiving growing interest. The book begins with a discussion of database querying, which motivates the use of the paradigm of Montague Semantics, and discusses the details of the intensional logic IL_s. This is followed by a description of the author's own model, the Historical Relational Data Model (HRDM), which extends the relational data model to include a temporal dimension. Finally, the database query language QE-III is defined and examples illustrate its use. A formal model for the interpretation of questions is presented in this work, which will form the basis for much further research.
#4

Applications of Process Algebra

1990

This book gives applications of the theory of process algebra, or Algebra of Communicating Processes (ACP), that is, the study of concurrent or communicating processes within an algebraic framework. The approach is axiomatic; the authors consider structures that satisfy some set of (mostly equational) axioms and are equipped with several operators. Thus the term 'algebra' is used in the model-theoretic sense. The axiomatic approach enables one to organize the field of process theories. The theory is applied systematically to a number of situations, including systolic algorithms, the semantics of an object-oriented language, and protocols. It will be welcomed by computer scientists working in parallel programming.
#5

Process Algebra

1990

Process algebra, also known as the Algebra of Communicating Processes (ACP), is a mathematical theory, or model, of computer processes, particularly concurrent systems. In this introduction to ACP, the authors describe an algebraic formalism that can be used to specify parallel systems, reason about them, and improve them. Their axiomatic approach permits intensive calculations to be performed and establishes a uniform framework in which a range of models can be investigated. The book addresses issues such as parallelism, communication, abstraction, non-determinism, and fairness, and provides many examples, so it should appeal to computer scientists and mathematicians alike. It is a unique introduction to this model of concurrent programming and will be essential reading for all computer scientists interested in parallel processing and algebraic methods in computer science.
#6

Updating Logical Databases

1990

This book tackles the problems of update algorithms for databases. How can one construct and maintain a database of facts, capable of incorporating new information and getting rid of all outdated information, and yet in the process not disturb any other information in the database? The author presents a formal method of specifying the desired change intensionally, using a "formula-based" approach to updates rather than a "model-based" technique. The complexity of the algorithms, the choice of semantics and the means of enforcing integrity constraints are also discussed.
#7

Theoretical Foundations of VLSI Design

1990

This book discusses recent research in the theoretical foundations of several subjects of importance for the design of hardware, and for computer science in general. The physical technologies of very large scale integration (VLSI) are having major effects on the electronics industry. The potential diversity and complexity of digital systems have set off a revolution in the technologies of digital design, involving the application of concepts and methods to do with algorithms and programming. In return, the problems of VLSI design have led to new subjects becoming of importance in computer science. Topics covered in this volume include: models of VLSI complexity; complexity theory; systolic algorithm design; specification theory; verification theory; design by stepwise refinement and transformations. A thorough literature survey with an exhaustive bibliography is also included. The book has grown from a workshop held at the Centre for Theoretical Computer Science at Leeds University and organised by the editors.
#8

A Unifying Framework for Structured Analysis and Design Models

An Approach Using Initial Algebra Semantics and Category Theory

1991

Structured methodologies are a popular and powerful tool in information systems development. Many different ones exist, each employing a number of models, and so a specification must be converted from one form to another during the development process. To solve this problem, Dr. Tse proposes a unifying framework behind popular structured models, approaching the problem from the viewpoints of algebra and category theory. He not only develops the framework but also illustrates its practical and theoretical usefulness. Thus, this book will provide insight for software engineers into how methodologies can be formalized, and will open up a range of applications and problems for theoretical computer scientists.
#9

Design Theory and Computer Science

1991

The author examines the logic and methodology of design from the perspective of computer science. Computers provide the context for this examination both by discussion of the design process for hardware and software systems and by consideration of the role of computers in design in general. The central question posed by the author is whether or not we can construct a theory of design.
#10

Nets, Terms and Formulas

Three Views of Concurrent Processes and their Relationship

1991

The author presents a theory whereby nets, terms and formulas represent concurrent processes at three different levels of abstraction. Petri nets are used to describe all details of the operational machine behavior of processes. Algebraic process terms, in the style of the Calculus of Communicating Systems (CCS) and Communicating Sequential Processes (CSP), two of the main models for parallel programming, are used to describe the architecture of processes, i.e. how they can be composed from subprocesses. Logical formulas of a first-order trace logic are used to describe the safety and liveness aspects of the communication behavior of processes as required by their users. The main emphasis and technical contribution of this theory are transformations for a top-down design of concurrent processes, starting with formulas and proceeding through terms to nets.
#11

Belief Revision

1992

Belief revision is a topic of much interest in theoretical computer science and logic, and it forms a central problem in research into artificial intelligence. In simple terms: how do you update a database of knowledge in the light of new information? What if the new information is in conflict with something that was previously held to be true? An intelligent system should be able to accommodate all such cases. This book contains a collection of research articles on belief revision that are completely up to date, together with an introductory chapter that presents a survey of current research in the area and the fundamentals of the theory. Thus this volume will be useful as a textbook on belief revision.
#12

Programs, Recursion and Unbounded Choice

1992

Predicate transformation semantics is the best specification method for the development of correct and well-structured computer programs. This book is a complete account of the predicate transformation calculus semantics of sequential programs, including repetitions, recursive procedures, computational induction, and unbounded nondeterminacy. The author develops the theory to a greater depth than has been achieved before, and describes it in a way that makes it readily compatible with programming methodology. He gives proofs of the programming rules for partial and total correctness of repetitions and recursive procedures, supplies new rules for proving incorrectness, and provides a stronger rule for proving that two programs satisfy the same specifications. Finally, the semantics are extended so that non-terminating programs can be specified as well.
#13

The Logic of Typed Feature Structures

With Applications to Unification Grammars, Logic Programs and Constraint Resolution

1992

This book develops the theory of typed feature structures, a new form of data structure that generalizes both the first-order terms of logic programs and feature-structures of unification-based grammars to include inheritance, typing, inequality, cycles and intensionality. It presents a synthesis of many existing ideas into a uniform framework, which serves as a logical foundation for grammars, logic programming and constraint-based reasoning systems. Throughout the text, a logical perspective is adopted that employs an attribute-value description language along with complete equational axiomatizations of the various systems of feature structures. Efficiency concerns are discussed and complexity and representability results are provided. The application of feature structures to phrase structure grammars is described and completeness results are shown for standard evaluation strategies. Definite clause logic programs are treated as a special case of phrase structure grammars. Constraint systems are introduced and an enumeration technique is given for solving arbitrary attribute-value logic constraints. This book with its innovative approach to data structures will be essential reading for researchers in computational linguistics, logic programming and knowledge representation. Its self-contained presentation makes it flexible enough to serve as both a research tool and a textbook.
#14

Two-Level Functional Languages

1992

The authors describe here a framework in which the type notation of functional languages is extended with a notation for binding times (that is, run-time and compile-time), so that the two can be distinguished. Consequently, the ability to specify code and verify program correctness can be improved. Two developments are needed. The first introduces the binding-time distinction into the lambda calculus, in a manner analogous to the introduction of types into the untyped lambda calculus; methods are also presented for introducing combinators for run-time. The second concerns the interpretation of the resulting language, which is known as the mixed lambda-calculus and combinatory logic. The notion of "parametrized semantics" is used to describe code generation and abstract interpretation. The code generation is for a simple abstract machine designed for the purpose; it is close to the categorical abstract machine. The abstract interpretation focuses on a strictness analysis that generalizes Wadler's analysis for lists. It is also shown how the results of abstract interpretation may be used to improve the code generation.
#15

Predicate Transformer Semantics

1992

A central problem in the design of programming systems is to provide methods for verifying that computer code performs to specification. This book presents a rigorous foundation for defining Boolean categories, in which the relationship between specification and behaviour can be explored. Boolean categories provide a rich interface between program constructs and techniques familiar from algebra, for instance matrix- or ideal-theoretic methods. The book's distinction is that the approach relies on only a single program construct (the first-order theory of categories), the others being derived mathematically from four axioms. Development of these axioms (which are obeyed by an abundance of program paradigms) yields Boolean algebras of 'predicates', loop-free constructs, and a calculus of partial and total correctness which is shown to be the standard one of Hoare, Dijkstra, Pratt, and Kozen. The book is based in part on courses taught by the author, and will appeal to graduate students and researchers in theoretical computer science.
#16

Action Semantics

1992

Action Semantics is a novel approach to the formal description of programming languages. Its abstractness is at an intermediate level, between that of denotational and operational semantics. Action Semantics has considerable pragmatic advantages over all previous approaches, in its comprehensibility and accessibility, and especially in the usefulness of its semantic descriptions of realistic programming languages. In this volume, Dr Peter Mosses gives a thorough introduction to action semantics, and provides substantial illustrations of its use. Graduates of computer science or maths who have an interest in the semantics of programming languages will find Action Semantics a most helpful book.
#17

Formal Specification and Design

1992

Formal specification is a method for precisely modelling computer-based systems that combines concepts from software engineering and mathematical logic. In this book the authors describe algebraic and state-based specification techniques from the unified view of the Common Object-oriented Language for Design, COLD, a wide-spectrum language in the tradition of VDM and Z. The kernel language is explained in detail, with many examples, including: set representation, a display device, an INGRES-like database system, and a line editor. Fundamental techniques such as initial algebra semantics, loose semantics, partial functions, hiding, sharing, predicate and dynamic logic, abstraction functions, representation of invariants and black-box correctness are also presented. More advanced ideas, for example Horn logic, and large systems are given in the final part. Appendices contain full details of the language's syntax and a specification library. Techniques for software development and design are emphasised throughout, so the book will be an excellent choice for courses in these areas.
#18

Deductive and Declarative Programming

1992

In this book, the author develops deduction-oriented methods for reasoning about functional and logic programs. The methods are based on the inductive theories of suitable data type specifications and exploit both classical theorem-proving and term rewriting. Detailed examples accompany the development of the methods, and their use is supported by a prototyping system that is documented at the end of the book.
#19

The Clausal Theory of Types

1993

This book presents the theoretical foundation of a higher-order logic programming language with equality, based on the clausal theory of types. A long-sought goal of logic programming, the clausal theory of types is a logic programming language that allows functional computation as a primitive operation while having rigorous, sound, and complete declarative and operational semantics. The language is very powerful, supporting higher-order equational deduction and functional computation. Its higher-order syntax makes it concise and expressive, abstract data types can be expressed in it, and searching for multiple solutions is a basic operation. The author proves a number of important and surprising results: a Skolem-Herbrand-Gödel theorem for higher-order logic, and a Higher-Order Resolution Theorem, which includes as special cases some previously unproven conjectures about equational matching and higher-order matching.
#20

Algebraic Specification of Communication Protocols

1993

Modern computer networks now circle the world, and the transmission of information among them depends on the many different protocols that define the behavior of the sender and receiver. It is clear, therefore, that the accurate description of these protocols is important if harmonious communication is to be maintained. In this book the authors use the formal specification language PSF to provide an unambiguous description of several communication protocols of varying levels of complexity, ranging from the alternating bit protocol to the token ring protocol. Beginners, as well as professionals in the field of communication protocols, will benefit from both the methods of specification described and the protocols discussed in this book.
#21

Higher Order Logic and Hardware Verification

1993

Dr. Melham shows here how formal logic can be used to specify the behavior of hardware designs and reason about their correctness. A primary theme of the book is the use of abstraction in hardware specification and verification. The author describes how certain fundamental abstraction mechanisms for hardware verification can be formalized in logic and used to express assertions about design correctness and the relative accuracy of models of hardware behavior. His approach is pragmatic and driven by examples. He also includes an introduction to higher-order logic, which is a widely used formalism in this subject, and describes how that formalism is actually used for hardware verification. The book is based in part on the author's own research as well as on graduate teaching. Thus it can be used to accompany courses on hardware verification and as a resource for research workers.
#22

Metamathematics, Machines and Gödel's Proof

1994

The automatic verification of large parts of mathematics has been an aim of many mathematicians from Leibniz to Hilbert. While Gödel's first incompleteness theorem showed that no computer program could automatically prove certain true theorems in mathematics, the advent of electronic computers and sophisticated software means in practice there are many quite effective systems for automated reasoning that can be used for checking mathematical proofs. This book describes the use of a computer program to check the proofs of several celebrated theorems in metamathematics including those of Gödel and Church-Rosser. The computer verification using the Boyer-Moore theorem prover yields precise and rigorous proofs of these difficult theorems. It also demonstrates the range and power of automated proof checking technology. The mechanization of metamathematics itself has important implications for automated reasoning, because metatheorems can be applied as labor-saving devices to simplify proof construction.
#23

Mathematical Theory of Domains

1994

Domain theory is the mathematical framework that is used to model the semantics of computer programs and the theory of computation. This is the first book on the subject that attempts to provide a rigorous introduction to the topic in a manner accessible to computer scientists by motivating the mathematics with computer science examples.
#24

Free Choice Petri Nets

1995

Petri nets are a popular and powerful formal model for the analysis and modelling of concurrent systems, and a rich theory has developed around them. Petri nets are taught to undergraduates, and also used by industrial practitioners. This book focuses on a particular class of Petri nets, free-choice Petri nets, which play a central role in the theory. The text is very clearly organised, with every notion carefully explained and every result proved. Clear expositions are given of place invariants, siphons, traps and many other important analysis techniques. The material is organised along the lines of a course book, and each chapter contains numerous exercises, making this book ideal for graduate students and research workers alike.
#25

The Uncertain Reasoner's Companion

A Mathematical Perspective

1994

Reasoning under uncertainty, that is, making judgments with only partial knowledge, is a major theme in artificial intelligence. Professor Paris provides here an introduction to the mathematical foundations of the subject. The author presents the key results on the subject, and formalizes within a unified framework the main contemporary approaches and assumptions. He concentrates on giving clear mathematical formulations, analyses, justifications, and consequences of the main theories about uncertain reasoning.
#26

Epistemic Logic for AI and Computer Science

1995

Epistemic logic has grown from its philosophical beginnings to find diverse applications in computer science as a means of reasoning about the knowledge and belief of agents. This book, based on courses taught at universities and summer schools, provides a broad introduction to the subject; many exercises are included together with their solutions. The authors begin by presenting the necessary apparatus from mathematics and logic, including Kripke semantics and the well-known modal logics K, T, S4 and S5. Then they turn to applications in the contexts of distributed systems and artificial intelligence: topics that are addressed include the notions of common knowledge, distributed knowledge, explicit and implicit belief, the interplays between knowledge and time, and knowledge and action, as well as a graded (or numerical) variant of the epistemic operators. The problem of logical omniscience is also discussed extensively. Halpern and Moses' theory of honest formulae is covered, and a digression is made into the realm of non-monotonic reasoning and preferential entailment. Moore's autoepistemic logic is discussed, together with Levesque's related logic of 'all I know'. Furthermore, it is shown how one can base default and counterfactual reasoning on epistemic logic.
#27

Information Flow

The Logic of Distributed Systems

1997

Information is a central topic in computer science, cognitive science, and philosophy. In spite of its importance in the "information age," there is no consensus on what information is, what makes it possible, and what it means for one medium to carry information about another. Drawing on ideas from mathematics, computer science, and philosophy, this book addresses the definition and place of information in society. The authors, observing that information flow is possible only within a connected distribution system, provide a mathematically rigorous, philosophically sound foundation for a science of information. They illustrate their theory by applying it to a wide range of phenomena, from file transfer to DNA, from quantum mechanics to speech act theory.
#29

Domains and Lambda-Calculi

1998

This book describes the mathematical aspects of the semantics of programming languages. The main goals are to provide formal tools to assess the meaning of programming constructs in both a language-independent and a machine-independent way and to prove properties about programs, such as whether they terminate, or whether their result is a solution of the problem they are supposed to solve. In order to achieve this the authors first present, in an elementary and unified way, the theory of certain topological spaces that have proved of use in the modeling of various families of typed lambda calculi considered as core programming languages and as meta-languages for denotational semantics. This theory is now known as Domain Theory, and was founded as a subject by Scott and Plotkin. One of the main concerns is to establish links between mathematical structures and more syntactic approaches to semantics, often referred to as operational semantics, which is also described. This dual approach has the double advantage of motivating computer scientists to do some mathematics and of interesting mathematicians in unfamiliar application areas from computer science.
#30

Data Refinement

Model-Oriented Proof Methods and their Comparison

1998

The goal of this book is to provide a comprehensive and systematic introduction to the important and highly applicable method of data refinement and the simulation methods used for proving its correctness. The authors concentrate in the first part on the general principles needed to prove data refinement correct. They begin with an explanation of the fundamental notions, showing that data refinement proofs reduce to proving simulation. The topics of Hoare Logic and the Refinement Calculus are introduced and a general theory of simulations is developed and related to them. Accessibility and comprehension are emphasized in order to guide newcomers to the area. The book's second part contains a detailed survey of important methods in this field, such as VDM, and the methods due to Abadi & Lamport, Hehner, Lynch and Reynolds, Back's refinement calculus and Z. All these methods are carefully analysed, and shown to be either incomplete, with counterexamples to their application, or to be always applicable whenever data refinement holds. This is shown by proving, for the first time, that all these methods can be described and analysed in terms of two simple notions: forward and backward simulation. The book is self-contained, starting at advanced undergraduate level and taking the reader to the state of the art in methods for proving simulation.
#31

Hybrid Graph Theory and Network Analysis

1999

This book combines traditional graph theory with the matroid view of graphs in order to throw light on the mathematical approach to network analysis. The authors examine in detail two dual structures associated with a graph, namely circuits and cutsets. These are strongly dependent on one another and together constitute a third, hybrid, vertex-independent structure called a graphoid, whose study is here termed hybrid graph theory. This approach has particular relevance for network analysis. The first account of the subject in book form, the text includes many new results as well as the synthesizing and reworking of much research done over the past thirty years (historically, the study of hybrid aspects of graphs owes much to the foundational work of Japanese researchers). This work will be regarded as the definitive account of the subject, suitable for all those working in theoretical network analysis, whether mathematicians, computer scientists or electrical engineers.
#34

Rippling

Meta-Level Guidance for Mathematical Reasoning

2004

The automation of mathematical reasoning has been an important topic of research almost since computers were invented. The new technique of rippling, described here for the first time in book form, is an approach to mathematical reasoning that uses heuristics to guide search. Rippling addresses the problem of combinatorial explosion, which has proved a huge obstacle in the past, and the book offers a systematic and comprehensive introduction both to it and to the wider subject of automated inductive theorem proving.
#36

Advanced Topics in Bisimulation and Coinduction

2011

Coinduction is a method for specifying and reasoning about infinite data types and automata with infinite behaviour. In recent years, it has come to play an ever more important role in the theory of computing. It is studied in many disciplines, including process theory and concurrency, modal logic and automata theory. Typically, coinductive proofs demonstrate the equivalence of two objects by constructing a suitable bisimulation relation between them. This collection of surveys is aimed at both researchers and Master's students in computer science and mathematics and deals with various aspects of bisimulation and coinduction, with an emphasis on process theory. Seven chapters cover the following topics: history; algebra and coalgebra; algorithmics; logic; higher-order languages; enhancements of the bisimulation proof method; and probabilities. Exercises are also included to help the reader master new material.
#37

Nominal Sets

Names and Symmetry in Computer Science

2013

Nominal sets provide a promising new mathematical analysis of names in formal languages based upon symmetry, with many applications to the syntax and semantics of programming language constructs that involve binding, or localising names. Part I provides an introduction to the basic theory of nominal sets. In Part II, the author surveys some of the applications that have developed in programming language semantics (both operational and denotational), functional programming and logic programming. As the first book to give a detailed account of the theory of nominal sets, it will be welcomed by researchers and graduate students in theoretical computer science.

Authors

James Clifford
Author · 6 books

James Clifford is a historian and Professor in the History of Consciousness Department at the University of California, Santa Cruz. Clifford and Hayden White were among the first faculty directly appointed to the History of Consciousness Ph.D. program in 1978, which was originally the only graduate department at UC Santa Cruz. The History of Consciousness department continues to be an intellectual center for innovative interdisciplinary and critical scholarship in the U.S. and abroad, largely due to Clifford and White's influence, as well as the work of other prominent faculty hired in the 1980s. Clifford served as Chair of the department from 2004 to 2007. He is the author of several widely cited and translated books, including The Predicament of Culture: Twentieth Century Ethnography, Literature and Art (1988) and Routes: Travel and Translation in the Late 20th Century (1997), and the editor, with George Marcus, of Writing Culture: The Poetics and Politics of Ethnography (1986). Clifford's work has sparked controversy and critical debate in a number of disciplines, such as literature, art history and visual studies, and especially cultural anthropology, as his literary critiques of written ethnography contributed greatly to that discipline's important self-critical period of the 1980s and early 1990s. Clifford's dissertation research, conducted at Harvard University in History (1969-1977), focused on the anthropologist Maurice Leenhardt and Melanesia. However, because of his impact on the discipline of anthropology, Clifford is sometimes mistaken for an anthropologist with graduate training in cultural anthropology; in fact, his work in anthropology is usually critical and historical in nature, and does not often include fieldwork or extended research at a single field site. A geographical interest in Melanesia continues to influence Clifford's scholarship, as does his work on issues related to indigeneity and on fields like globalization, museum studies, visual and performance studies, cultural studies, and translation, often as they relate to how the category of the indigenous is produced.

Alan Gibbons
Author · 28 books

Alan Gibbons is an author of children's books and a Blue Peter Book Award winner. He currently lives in Liverpool, England, where he used to teach in a primary school. His father was a farm laborer who was hurt in an accident when Alan was eight years old; the family had to move to Crewe, Cheshire, where Alan experienced bullying for the first time. Gibbons trained as a teacher in his mid-thirties and began writing short stories for his pupils, though at first he never tried to get any of his work published. Later, he began to write professionally. In 2000, he won the Blue Peter Book Award in the "The Book I Couldn't Put Down" category for Shadow of the Minotaur, and he was a judge for the 2001 Blue Peter Book Awards. He was shortlisted for the Carnegie Medal in 2001 and 2003 and shortlisted twice for the Booktrust Teenage Prize. He has also won the Leicester Book of the Year, the Stockport Book Award, the Angus Book Award, the Catalyst Award, the Birmingham Chills Award, the Salford Young Adult Book Award and the Salford Librarians' Special Award.
