Book contents
- Frontmatter
- Contents
- Preface
- Introduction
- Acknowledgments
- Contributors
- Acronyms and Abbreviations
- Boolean Models and Methods in Mathematics, Computer Science, and Engineering
- Part I Algebraic Structures
- Part II Logic
- Part III Learning Theory and Cryptography
- 6 Probabilistic Learning and Boolean Functions
- 7 Learning Boolean Functions with Queries
- 8 Boolean Functions for Cryptography and Error-Correcting Codes
- 9 Vectorial Boolean Functions for Cryptography
- Part IV Graph Representations and Efficient Computation Models
- Part V Applications in Engineering
6 - Probabilistic Learning and Boolean Functions
from Part III - Learning Theory and Cryptography
Published online by Cambridge University Press: 05 June 2013
Summary
Introduction
This chapter explores the learnability of Boolean functions. Broadly speaking, the problem of interest is how to infer information about an unknown Boolean function given only its values on some points, together with the knowledge that it belongs to a particular class of Boolean functions. This broad description encompasses many more precise formulations, but here we focus on probabilistic models of learning, in which information about the function is provided through its values on a randomly drawn sample, and in which the criteria for successful “learning” are defined using probability theory. Other approaches, such as “exact query learning” (see [1, 18, 20] and Chapter 7 in this volume, for instance) and “specification,” “testing,” or “learning with a helpful teacher” (see [4, 12, 16, 21, 26]) are possible, and particularly interesting in the context of Boolean functions. In what follows, however, we concentrate on probabilistic models and aim to give a fairly thorough account of what can be said in two such models.
In the probabilistic models discussed, there are two separate, but linked, issues of concern. First, there is the question of how much information is needed about the values of a function on points before a good approximation to the function can be found. Second, there is the question of how, algorithmically, we might find a good approximation to the function. These two issues are usually termed the sample complexity and computational complexity of learning.
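Both issues can be made concrete with a classical example: the elimination algorithm for learning a monotone conjunction from random examples. The sketch below is illustrative only; the target function, the number of variables, and the sample size are assumptions chosen for the demonstration, not anything fixed by the chapter.

```python
import random

# Hypothetical unknown target: the monotone conjunction x0 AND x2
# over n = 5 Boolean variables (an assumption for illustration).
def target(x):
    return x[0] == 1 and x[2] == 1

def learn_conjunction(samples, n):
    """Classical elimination algorithm for monotone conjunctions:
    start with every variable in the hypothesis and drop any variable
    that is 0 in some positive example.  Variables of the target are
    1 in every positive example, so they are never dropped."""
    relevant = set(range(n))
    for x, label in samples:
        if label:  # positive example
            relevant -= {i for i in relevant if x[i] == 0}
    return relevant

random.seed(0)
n = 5
points = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(200)]
samples = [(x, target(x)) for x in points]

hypothesis = learn_conjunction(samples, n)
print(sorted(hypothesis))  # indices of variables kept in the learned conjunction
```

With 200 random examples the hypothesis coincides with the target's variable set {0, 2} with overwhelming probability, which illustrates the sample-complexity question (how many examples suffice), while the algorithm's single pass over the data illustrates the computational-complexity question.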
- Publisher: Cambridge University Press
- Print publication year: 2010