Jon Williamson
- Published in print: 2004
- Published Online: September 2007
- ISBN: 9780198530794
- eISBN: 9780191712982
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198530794.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
This book provides an introduction to, and analysis of, the use of Bayesian nets in causal modelling. It puts forward new conceptual foundations for causal network modelling: The book argues that probability and causality need to be interpreted as epistemic notions in order for the key assumptions behind causal models to hold. Under the epistemic view, probability and causality are understood in terms of the beliefs an agent ought to adopt. The book develops an objective Bayesian notion of probability and a corresponding epistemic theory of causality. This yields a general framework for causal modelling, which is extended to cope with recursive causal relations, logically complex beliefs and changes in an agent's language.
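As a rough illustration of the formal machinery at issue (a standard textbook factorization, not an excerpt from the book): a Bayesian net over a causal graph represents the joint probability as a product of each variable conditioned on its direct causes, so for a hypothetical chain A → B → C:

```latex
% Illustrative only: the factorization assumed by a Bayesian net on the chain A -> B -> C.
P(A, B, C) = P(A)\,P(B \mid A)\,P(C \mid B)
```

The book's conceptual question is when assumptions of this kind are warranted; the epistemic reading of probability and causality is its proposed answer.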
Ettore Casari
- Published in print: 2016
- Published Online: January 2017
- ISBN: 9780198788294
- eISBN: 9780191830228
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198788294.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
A starting point of Bolzano’s logical reflection was the conviction that among truths there is a connection, according to which some truths are grounds of others, and these in turn are consequences of the former, and that such a connection is objective, i.e. subsisting independently of every cognitive activity of the subject. In the attempt to account for the distinction between subjective and objective levels of knowledge, Bolzano gradually gained the conviction that the reference of the subject to the object is mediated by a realm of entities without existence that, recalling the Stoic lectà, are here called ‘lectological’. Moreover, of the two main ways through which that reference takes place—psychic activity and linguistic activity—Bolzano favoured the first and traced the problems of the second back to it; i.e. he considered those intermediate entities first as possible contents of psychic phenomena and only subordinately, on the basis of a complex theory of signs, as meanings of linguistic phenomena. This book follows this schema and treats, in great detail, first, lectological entities (ideas and propositions in themselves), second, cognitive psychic phenomena (subjective ideas and judgements), and, finally, linguistic phenomena. Moreover, it tries to bring to light the extraordinary systematic character of Bolzano’s logical thought, and it does this by showing that the main logical ideas, developed principally in the first three parts of the Theory of Science, published in 1837, can readily be presented formally within the well-known Hilbertian epsilon-calculus.
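The Hilbertian epsilon-calculus mentioned at the end is the term-forming extension of predicate logic built around Hilbert's critical axiom; as a brief reminder (standard material, not drawn from the book):

```latex
% Hilbert's critical axiom: the epsilon term acts as a canonical witness,
% making the existential quantifier definable.
A(t) \rightarrow A(\varepsilon x\, A(x)), \qquad
\exists x\, A(x) \;\leftrightarrow\; A(\varepsilon x\, A(x))
```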
Phyllis McKay Illari, Federica Russo, and Jon Williamson (eds)
- Published in print: 2011
- Published Online: September 2011
- ISBN: 9780199574131
- eISBN: 9780191728921
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199574131.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
There is a need for integrated thinking about causality, probability, and mechanism in scientific methodology. A panoply of disciplines, ranging from epidemiology and biology through to econometrics and physics, routinely make use of these concepts to infer causal relationships. But each of these disciplines has developed its own methods, in which causality and probability are often understood differently and the mechanisms involved often look very different. This variegated situation raises the question of whether progress in understanding the tools of causal inference in some sciences can lead to progress in other sciences, or whether the sciences are really using different concepts. Causality and probability are long-established central concepts in the sciences, with a corresponding philosophical literature examining their problems. The philosophical literature examining the concept of mechanism, on the other hand, is more recent, and there has been no clear account of how mechanisms relate to causality and probability. If we are to understand causal inference in the sciences, we need to develop some account of the relationship between causality, probability, and mechanism. This book represents a joint project by philosophers and scientists to tackle this question, and related issues, as they arise in a wide variety of disciplines across the sciences.
André Nies
- Published in print: 2009
- Published Online: May 2009
- ISBN: 9780199230761
- eISBN: 9780191710988
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199230761.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
The complexity and randomness aspects of a set of natural numbers are closely related. Traditionally, computability theory is concerned with the complexity aspect. However, computability-theoretic tools can also be used to introduce mathematical counterparts for the intuitive notion of randomness of a set. Recent research shows that, conversely, concepts and methods originating from randomness enrich computability theory. The book is about these two aspects of sets of natural numbers and about their interplay. For the first aspect, lowness and highness properties of sets are introduced. For the second aspect, the randomness of finite objects is studied first, and then the randomness of sets of natural numbers. A hierarchy of mathematical randomness notions is established. Each notion matches the intuitive idea of randomness to some extent. The advantages and drawbacks of notions weaker and stronger than Martin-Löf randomness are discussed. The main topic is the interplay of the computability and randomness aspects. Research on this interplay has advanced rapidly in recent years. One chapter focuses on injury-free solutions to Post's problem. A core chapter contains a comprehensible treatment of lowness properties below the halting problem, and how they relate to K-triviality. Each chapter shows how the complexity properties are related to randomness. The book also contains analogues, in the area of higher computability theory, of results from the preceding chapters, reflecting very recent research.
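For orientation, two of the notions alluded to can be stated as follows (standard definitions, not excerpts from the book), where K denotes prefix-free Kolmogorov complexity and A↾n the first n bits of A:

```latex
% Martin-Löf randomness via incompressibility (Levin–Schnorr), and K-triviality.
A \text{ is Martin-L\"of random} \iff \exists c\,\forall n\; K(A\!\upharpoonright\! n) \ge n - c
A \text{ is } K\text{-trivial} \iff \exists c\,\forall n\; K(A\!\upharpoonright\! n) \le K(n) + c
```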
Antti Oulasvirta, Per Ola Kristensson, Xiaojun Bi, and Andrew Howes (eds)
- Published in print: 2018
- Published Online: March 2018
- ISBN: 9780198799603
- eISBN: 9780191839832
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198799603.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
This book presents computational interaction as an approach to explaining and enhancing the interaction between humans and information technology. Computational interaction applies abstraction, automation, and analysis to inform our understanding of the structure of interaction and also to inform the design of the software that drives new and exciting human-computer interfaces. The methods of computational interaction allow, for example, designers to identify user interfaces that are optimal against some objective criteria. They also allow software engineers to build interactive systems that adapt their behaviour to better suit individual capacities and preferences. Embedded in an iterative design process, computational interaction has the potential to complement human strengths and provide methods for generating inspiring and elegant designs. Computational interaction does not exclude the messy and complicated behaviour of humans; rather, it embraces it by, for example, using models that are sensitive to uncertainty and that capture subtle variations between individual users. It also promotes the idea that there are many aspects of interaction that can be augmented by algorithms. This book introduces computational interaction design to the reader by exploring a wide range of computational interaction techniques, strategies and methods. It explains how techniques such as optimisation, economic modelling, machine learning, control theory, formal methods, cognitive models and statistical language processing can be used to model interaction and design more expressive, efficient and versatile interaction.
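As a toy, hypothetical illustration of identifying a user interface that is "optimal against some objective criteria" (the menu items, frequencies, and cost model below are assumptions, not from the book): exhaustively search the orderings of a small menu for the one minimizing expected selection cost under a linear position-cost model.

```python
# Hypothetical sketch: choose the menu ordering that minimizes expected
# selection cost, assuming cost grows linearly with an item's position.
from itertools import permutations

click_freq = {"Open": 0.5, "Save": 0.3, "Export": 0.15, "Quit": 0.05}  # assumed usage data

def expected_cost(order):
    # position i (0-based) costs i + 1 units to reach; weight by how often the item is used
    return sum((i + 1) * click_freq[item] for i, item in enumerate(order))

best = min(permutations(click_freq), key=expected_cost)
print(best, round(expected_cost(best), 3))
```

Real computational-interaction methods replace this toy cost model with empirically grounded models of human performance, but the optimisation pattern is the same.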
William B. Rouse
- Published in print: 2019
- Published Online: September 2019
- ISBN: 9780198846420
- eISBN: 9780191881589
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198846420.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
This book discusses the use of models and interactive visualizations to explore designs of systems and policies in determining whether such designs would be effective. Executives and senior managers are very interested in what “data analytics” can do for them and, quite recently, what the prospects are for artificial intelligence and machine learning. They want to understand and then invest wisely. They are reasonably skeptical, having experienced overselling and under-delivery. They ask about reasonable and realistic expectations. Their concern is with the futurity of decisions they are currently entertaining. They cannot fully address this concern empirically. Thus, they need some way to make predictions. The problem is that one rarely can predict exactly what will happen, only what might happen. To overcome this limitation, executives can be provided predictions of possible futures and the conditions under which each scenario is likely to emerge. Models can help them to understand these possible futures. Most executives find such candor refreshing, perhaps even liberating. Their job becomes one of imagining and designing a portfolio of possible futures, assisted by interactive computational models. Understanding and managing uncertainty is central to their job. Indeed, doing this better than competitors is a hallmark of success. This book is intended to help them understand what fundamentally needs to be done, why it needs to be done, and how to do it. The hope is that readers will discuss this book and develop a “shared mental model” of computational modeling in the process, which will greatly enhance their chances of success.
Andreas Bolfing
- Published in print: 2020
- Published Online: October 2020
- ISBN: 9780198862840
- eISBN: 9780191895463
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198862840.001.0001
- Subject: Mathematics, Computational Mathematics / Optimization, Logic / Computer Science / Mathematical Philosophy
Many online applications, especially in the financial industries, are running on blockchain technologies in a decentralized manner, without the use of an authoritative entity or a trusted third party. Such systems are secured only by cryptographic protocols and a consensus mechanism. As blockchain-based solutions will continue to revolutionize online applications in a growing digital market in the future, one needs to identify the principal opportunities and potential risks. Hence, it is essential to learn the mathematical and cryptographic procedures behind blockchain technology in order to understand how such systems work and where the weak points are. The book provides an introduction to the mathematical and cryptographic concepts behind blockchain technologies and shows how they are applied in blockchain-based systems. This includes an introduction to the general blockchain technology approaches that are used to build the so-called immutable ledgers, which are based on cryptographic signature schemes. As future quantum computers will break some of the current cryptographic primitives, the book considers their security and presents current research results that estimate the impact on blockchain-based systems if some of these cryptographic primitives break. Based on the example of Bitcoin, it shows that weak cryptographic primitives pose a possible danger for the ledger, which can be overcome through the use of so-called post-quantum cryptographic approaches, which are also introduced.
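A minimal sketch of the hash-chaining idea behind "immutable ledgers" (illustrative only; real blockchains add digital signatures, consensus, and much more):

```python
# Minimal illustration: each block commits to the previous block's hash,
# so altering an earlier block invalidates every later one.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

chain: list = []
append_block(chain, "alice pays bob 5")
append_block(chain, "bob pays carol 2")

chain[0]["data"] = "alice pays bob 500"               # tamper with history
print(block_hash(chain[0]) == chain[1]["prev_hash"])  # False: the chain no longer links up
```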
Steven J. Osterlind
- Published in print: 2019
- Published Online: January 2019
- ISBN: 9780198831600
- eISBN: 9780191869532
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198831600.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
The Error of Truth recounts the astonishing and unexpected tale of how quantitative thinking was invented and rose to primacy in our lives in the nineteenth and early twentieth centuries, bringing us to an entirely new perspective on what we know about the world and how we know it—even on what we each think about ourselves. Quantitative thinking is our inclination to view natural and everyday phenomena through a lens of measurable events, with forecasts, odds, predictions, and likelihood playing a dominant part. This worldview, or Weltanschauung, is unlike anything humankind had before, and it came about because of a momentous human achievement: namely, we had learned how to measure uncertainty. Probability as a science had been invented. Through probability theory, we now had correlations, reliable predictions, regressions, the bell-shaped curve for studying social phenomena, and the psychometrics of educational testing. Significantly, these developments in mathematics happened during a relatively short period in world history: roughly, the 130-year period from 1790 to 1920, from about the close of the Napoleonic era, through the Enlightenment and the Industrial Revolutions, to the end of World War I. Quantification is now everywhere in our daily lives, such as in the ubiquitous microchip in smartphones, cars, and appliances, in the Bayesian logic of artificial intelligence, and in applications in business, engineering, medicine, economics, and elsewhere. Probability is the foundation of our quantitative thinking. Here we see its story: when, why, and how it came to be and changed us forever.
Keith Martin
- Published in print: 2017
- Published Online: July 2017
- ISBN: 9780198788003
- eISBN: 9780191829956
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198788003.001.0001
- Subject: Mathematics, Computational Mathematics / Optimization, Logic / Computer Science / Mathematical Philosophy
Cryptography is a vital technology that underpins the security of information in computer networks. This book presents a comprehensive introduction to the role that cryptography plays in providing information security for technologies such as the Internet, mobile phones, payment cards, and wireless local area networks. Focusing on the fundamental principles that ground modern cryptography as they arise in modern applications, it avoids both an over-reliance on transient technologies and overwhelming theoretical research. The first part of the book provides essential background, identifying the core security services provided by cryptography. The next part introduces the main cryptographic mechanisms that deliver these security services such as encryption, hash functions, and digital signatures, discussing why they work and how to deploy them, without delving into any significant mathematical detail. In the third part, the important practical aspects of key management are introduced, which is essential for making cryptography work in real systems. The last part considers the application of cryptography. A range of application case studies is presented, alongside a discussion of the wider societal issues arising from use of cryptography to support contemporary cyber security.
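As a small, hedged illustration of one security service of the kind discussed (the specific construction is my example, not the book's): a keyed hash function (HMAC) provides data integrity and data-origin authentication, since only a holder of the shared key can produce a valid tag.

```python
# Illustrative only: an HMAC-SHA-256 tag lets the holder of the shared key
# detect any modification of the message.
import hashlib
import hmac

key = b"shared-secret-key"          # hypothetical key agreed in advance
msg = b"pay 100 to account 42"

tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

def verify(key: bytes, msg: bytes, tag: str) -> bool:
    return hmac.compare_digest(hmac.new(key, msg, hashlib.sha256).hexdigest(), tag)

print(verify(key, msg, tag))                       # True: message unchanged
print(verify(key, b"pay 999 to account 42", tag))  # False: tampering detected
```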
Laura Crosilla and Peter Schuster (eds)
- Published in print: 2005
- Published Online: September 2007
- ISBN: 9780198566519
- eISBN: 9780191713927
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198566519.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
Constructive mathematics is a vital area of research which has gained special attention in recent years due to the distinctive presence of computational content in its theorems. This characteristic had already been stressed by Bishop in his fundamental contribution to the subject, Foundations of Constructive Analysis (1967). Following Bishop's new approach to mathematics based on intuitionistic logic, various formal systems were introduced in the early 1970s with the intent to clarify the notion of set theory underlying his work. This book addresses the relationship between foundations and practice of constructive mathematics Bishop-style, by presenting on the one hand some very recent contributions to constructive analysis and formal topology, and on the other hand studies which underline the capabilities and expressiveness of various formal systems which have been introduced as foundations for constructive mathematics, such as constructive set and type theories. The book aims to provide a point of reference by presenting up-to-date contributions by some of the most active scholars in each field. A variety of approaches and techniques are represented to give as wide a view as possible and promote cross-fertilization between different styles and traditions. The book also aims at further promoting awareness and discussion on the issue of bridging foundations and practice of constructive mathematics, thus closing the apparent distance that has emerged between them in recent years.
Frank C. Zagare
- Published in print: 2019
- Published Online: February 2019
- ISBN: 9780198831587
- eISBN: 9780191869525
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198831587.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy, Applied Mathematics
The main purpose of this book is to demonstrate, by way of example, the several advantages of using a formal game-theoretic framework to explain complex events, diplomatic history, and contentious interstate relationships, via causal mechanisms and rationality. Chapter 1 lays out the broad parameters and major concepts of the mathematical theory of games and its applications in the security studies literature. Chapter 2 explores a number of issues connected with the use of game-theoretic models to organize analytic narratives, both generally and specifically. Chapter 3 interprets the Moroccan crisis of 1905–6 in the context of an incomplete information game model. Chapter 4 surveys and evaluates several prominent attempts to use game theory to explain the strategic dynamic of the Cuban missile crisis of 1962. Chapter 5 offers a general explanation that answers all of the foundational questions associated with the Cuban crisis within the confines of a single, integrated, game-theoretic model with incomplete information. Chapter 6 uses the same game form to develop a logically consistent and empirically plausible explanation of the outbreak of war in Europe in early August 1914. Chapter 7 introduces perfect deterrence theory and contrasts it with the prevailing realist theory of interstate war prevention, and classical deterrence theory. Chapter 8 addresses the charge made by some behavioral economists (and many strategic analysts) that game theory is of limited utility for understanding interstate conflict behavior.
Anders Drachen, Pejman Mirza-Babaei, and Lennart Nacke (eds)
- Published in print: 2018
- Published Online: March 2018
- ISBN: 9780198794844
- eISBN: 9780191836336
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198794844.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy, Computational Mathematics / Optimization
Today, Games User Research (GUR) forms an integral component of the development of any kind of interactive entertainment. User research stands as the primary source of business intelligence in the incredibly competitive game industry. This book aims to provide the foundational, accessible, go-to resource for people interested in GUR. It is a community-driven effort—it is written by passionate professionals and researchers in the GUR community as a handbook and guide for everyone interested in user research and games. The book bridges the current gaps of knowledge in Games User Research, building the go-to volume for everyone working with games, with an emphasis on those new to the field.
Leon Horsten and Philip Welch (eds)
- Published in print: 2016
- Published Online: November 2016
- ISBN: 9780198759591
- eISBN: 9780191820373
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198759591.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
The logician Kurt Gödel in 1951 established a disjunctive thesis about the scope and limits of mathematical knowledge: either the mathematical mind is equivalent to a Turing machine (i.e., a computer) or there are absolutely undecidable mathematical problems. In the second half of the twentieth century, attempts have been made to arrive at a stronger conclusion. In particular, arguments have been produced by the philosopher J.R. Lucas and by the physicist and mathematician Roger Penrose that intend to show that the mathematical mind is more powerful than any computer. These arguments, and counterarguments to them, have not convinced the logical and philosophical community. The reason for this is an insufficiency of rigour in the debate. The contributions in this volume move the debate forward by formulating rigorous frameworks and formally spelling out and evaluating arguments that bear on Gödel’s disjunction in these frameworks. The contributions have been written by world-leading experts in the field.
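One common way of regimenting the disjunction (a standard paraphrase, not a quotation from the volume) writes W_e for the e-th computably enumerable set and takes "Mind" to be the set of mathematical statements that can in principle be humanly proved:

```latex
% Gödel's disjunction, informally regimented.
\neg\,\exists e\,\bigl(\mathrm{Mind} = W_e\bigr) \;\;\vee\;\; \exists\varphi\,\bigl(\varphi \text{ is absolutely undecidable}\bigr)
```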
Olle Häggström
- Published in print: 2016
- Published Online: January 2016
- ISBN: 9780198723547
- eISBN: 9780191790331
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198723547.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
This book challenges the widely held but oversimplified and even dangerous conception that progress in science and technology is our salvation, and the more of it, the better. The future will offer huge changes due to such progress, but it is not certain that all changes will be for the better. The unprecedented rate of technological development that the 20th century witnessed has made our lives today vastly different from those in 1900. No slowdown is in sight, and the 21st century will most likely see even more revolutionary changes than the 20th, due to advances in science, technology and medicine. Areas where extraordinary and perhaps disruptive advances can be expected include biotechnology, nanotechnology and machine intelligence. We may also look forward to various ways to enhance human cognitive and other abilities using pharmaceuticals, genetic engineering or machine–brain interfaces—perhaps to the extent of changing human nature beyond what we currently think of as human, and into a posthuman era. The potential benefits of all these technologies are enormous, but so are the risks, including the possibility of human extinction. The currently dominant attitude towards scientific and technological advances is tantamount to running blindfold and at full speed into a minefield. This book is a passionate plea for doing our best to map the territories ahead of us, and for acting with foresight, so as to maximize our chances of reaping the benefits of the new technologies while avoiding the dangers.
Rod Downey and Noam Greenberg
- Published in print: 2020
- Published Online: January 2021
- ISBN: 9780691199665
- eISBN: 9780691200217
- Item type: book
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691199665.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
Computability theory is a branch of mathematical logic and computer science that has become increasingly relevant in recent years. The field has developed growing connections in diverse areas of mathematics, with applications in topology, group theory, and other subfields. This book introduces a new hierarchy that allows the authors to classify the combinatorics of constructions from many areas of computability theory, including algorithmic randomness, Turing degrees, effectively closed sets, and effective structure theory. This unifying hierarchy gives rise to new natural definability results for Turing degree classes, demonstrating how dynamic constructions become reflected in definability. The book presents numerous construction techniques involving high-level nonuniform arguments, and the self-contained treatment is appropriate for graduate students and researchers. Blending traditional and modern research results in computability theory, the book establishes novel directions in the field.
Jon Williamson
- Published in print: 2010
- Published Online: September 2010
- ISBN: 9780199228003
- eISBN: 9780191711060
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199228003.001.0001
- Subject: Mathematics, Probability / Statistics, Logic / Computer Science / Mathematical Philosophy
Bayesian epistemology aims to answer the following question: How strongly should an agent believe the various propositions expressible in her language? Subjective Bayesians hold that it is largely (though not entirely) up to the agent as to which degrees of belief to adopt. Objective Bayesians, on the other hand, maintain that appropriate degrees of belief are largely (though not entirely) determined by the agent's evidence. This book states and defends a version of objective Bayesian epistemology. According to this version, objective Bayesianism is characterized by three norms: (i) Probability: degrees of belief should be probabilities; (ii) Calibration: they should be calibrated with evidence; and (iii) Equivocation: they should otherwise equivocate between basic outcomes. Objective Bayesianism has been challenged on a number of different fronts: for example, it has been accused of being poorly motivated, of failing to handle qualitative evidence, of yielding counter‐intuitive degrees of belief after updating, of suffering from a failure to learn from experience, of being computationally intractable, of being susceptible to paradox, of being language dependent, and of not being objective enough. The book argues that these criticisms can be met and that objective Bayesianism is a promising theory with an exciting agenda for further research.
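A toy example of how the three norms interact (my own illustration, not the book's): take a language with a single atomic proposition a and evidence establishing only that the chance of a is at least 0.8. Calibration narrows the admissible belief functions to those with P(a) in [0.8, 1], and Equivocation then selects the member of that set closest to the uniform "equivocator" P(a) = 0.5:

```latex
% Calibration yields a set of candidate belief functions; Equivocation picks its most equivocal member.
\mathbb{E} = \{\,P : P(a) \in [0.8,\,1]\,\}, \qquad P_{\mathbb{E}}(a) = 0.8
```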
Dov M. Gabbay and Larisa Maksimova
- Published in print: 2005
- Published Online: September 2007
- ISBN: 9780198511748
- eISBN: 9780191705779
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780198511748.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
This book focuses on interpolation and definability. These notions are not only central in pure logic, but have significant meaning and applicability in all areas where logic itself is applied, especially in computer science, artificial intelligence, logic programming, philosophy of science, and natural language. The book provides basic knowledge on interpolation and definability in logic, and contains a systematic account of material which has been presented in many papers. A variety of methods and results are presented, beginning with the famous theorems of Beth and Craig in classical predicate logic (1953–57) and proceeding to the most valuable achievements in non-classical areas of logic, mainly intuitionistic and modal logic. Together with semantical and proof-theoretic methods, close interrelations between logic and universal algebra are established and exploited.
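For orientation, Craig's interpolation theorem (1957), one of the starting points mentioned here, can be stated as follows (standard formulation, where voc(·) is the non-logical vocabulary of a formula):

```latex
% Craig interpolation for classical predicate logic.
\text{If } \models \varphi \rightarrow \psi, \text{ then there is an interpolant } \theta \text{ with }
\mathrm{voc}(\theta) \subseteq \mathrm{voc}(\varphi) \cap \mathrm{voc}(\psi)
\text{ such that } \models \varphi \rightarrow \theta \text{ and } \models \theta \rightarrow \psi.
```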
Jon Williamson
- Published in print: 2017
- Published Online: March 2017
- ISBN: 9780199666478
- eISBN: 9780191749292
- Item type: book
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199666478.001.0001
- Subject: Mathematics, Logic / Computer Science / Mathematical Philosophy
Inductive logic (also known as confirmation theory) seeks to determine the extent to which the premisses of an argument entail its conclusion. This book offers an introduction to the field of inductive logic and develops a new Bayesian inductive logic. Chapter 1 introduces perhaps the simplest and most natural account of inductive logic, classical inductive logic, which is attributable to Ludwig Wittgenstein. Classical inductive logic is seen to fail in a crucial way, so there is a need to develop more sophisticated inductive logics. Chapter 2 presents enough logic and probability theory for the reader to begin to study inductive logic, while Chapter 3 introduces the ways in which logic and probability can be combined in an inductive logic. Chapter 4 analyses the most influential approach to inductive logic, due to W.E. Johnson and Rudolf Carnap. Again, this logic is seen to be inadequate. Chapter 5 shows how an alternative approach to inductive logic follows naturally from the philosophical theory of objective Bayesian epistemology. This approach preserves the inferences that classical inductive logic gets right (Chapter 6). On the other hand, it also offers a way out of the problems that beset classical inductive logic (Chapter 7). Chapter 8 defends the approach by tackling several key criticisms that are often levelled at inductive logic. Chapter 9 presents a formal justification of the version of objective Bayesianism which underpins the approach. Chapter 10 explains what has been achieved and poses some open questions.
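To convey the flavour of the classical approach (a standard toy example in this tradition, not quoted from the book): probabilities are spread evenly over the state descriptions of the language, and the degree to which premisses entail a conclusion is the resulting conditional probability. With two atomic propositions a and b there are four equiprobable state descriptions, so:

```latex
% Degree to which the premiss (a or b) partially entails the conclusion a,
% under the uniform distribution over the four state descriptions of {a, b}.
P(a \mid a \vee b) \;=\; \frac{P(a \wedge (a \vee b))}{P(a \vee b)} \;=\; \frac{2/4}{3/4} \;=\; \frac{2}{3}
```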
Max A. Little
- Published in print:
- 2019
- Published Online:
- October 2019
- ISBN:
- 9780198714934
- eISBN:
- 9780191879180
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198714934.001.0001
- Subject:
- Mathematics, Logic / Computer Science / Mathematical Philosophy, Mathematical Physics
Digital signal processing (DSP) is one of the ‘foundational’ engineering topics of the modern world, without which technologies such as the mobile phone, television, CD and MP3 players, WiFi and radar would not be possible. A relative newcomer, statistical machine learning is the theoretical backbone of exciting technologies such as automatic techniques for car registration plate recognition, speech recognition, stock market prediction, defect detection on assembly lines, robot guidance and autonomous car navigation. Statistical machine learning exploits the analogy between intelligent information processing in biological brains and sophisticated statistical modelling and inference. DSP and statistical machine learning are of such wide importance to the knowledge economy that both have undergone rapid changes and seen radical improvements in scope and applicability. Both make use of key topics in applied mathematics such as probability and statistics, algebra, calculus, graphs and networks. Intimate formal links exist between the two subjects, and the resulting overlaps can be exploited to produce new DSP tools of surprising utility, highly suited to the contemporary world of pervasive digital sensors and high-powered yet cheap computing hardware. This book gives a solid mathematical foundation to, and details the key concepts and algorithms in, this important topic.
José Ferreirós
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691167510
- eISBN:
- 9781400874002
- Item type:
- book
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691167510.001.0001
- Subject:
- Mathematics, Logic / Computer Science / Mathematical Philosophy
This book presents a new approach to the epistemology of mathematics by viewing mathematics as a human activity whose knowledge is intimately linked with practice. Charting an exciting new direction in the philosophy of mathematics, the book uses the crucial idea of a continuum to provide an account of the development of mathematical knowledge that reflects the actual experience of doing math and makes sense of the perceived objectivity of mathematical results. Describing a historically oriented, agent-based philosophy of mathematics, the book shows how the mathematical tradition evolved from Euclidean geometry to the real numbers and set-theoretic structures. It argues for the need to take into account a whole web of mathematical and other practices that are learned and linked by agents, and whose interplay acts as a constraint. It demonstrates how advanced mathematics, far from being a priori, is based on hypotheses, in contrast to elementary math, which has strong cognitive and practical roots and therefore enjoys certainty. Offering a wealth of philosophical and historical insights, the book challenges us to rethink some of our most basic assumptions about mathematics, its objectivity, and its relationship to culture and science.