Book contents
- Frontmatter
- Contents
- Preface
- Prologue: A machine learning sampler
- 1 The ingredients of machine learning
- 2 Binary classification and related tasks
- 3 Beyond binary classification
- 4 Concept learning
- 5 Tree models
- 6 Rule models
- 7 Linear models
- 8 Distance-based models
- 9 Probabilistic models
- 10 Features
- 11 Model ensembles
- 12 Machine learning experiments
- Epilogue: Where to go from here
- Important points to remember
- References
- Index
Preface
Published online by Cambridge University Press: 05 November 2012
Summary
This book started life in the Summer of 2008, when my employer, the University of Bristol, awarded me a one-year research fellowship. I decided to embark on writing a general introduction to machine learning, for two reasons. One was that there was scope for such a book, to complement the many more specialist texts that are available; the other was that through writing I would learn new things – after all, the best way to learn is to teach.
The challenge facing anyone attempting to write an introductory machine learning text is to do justice to the incredible richness of the machine learning field without losing sight of its unifying principles. Put too much emphasis on the diversity of the discipline and you risk ending up with a ‘cookbook’ without much coherence; stress your favourite paradigm too much and you may leave out too much of the other interesting stuff. Partly through a process of trial and error, I arrived at the approach embodied in the book, which is to emphasise both unity and diversity: unity by separate treatment of tasks and features, both of which are common across any machine learning approach but are often taken for granted; and diversity through coverage of a wide range of logical, geometric and probabilistic models.
Clearly, one cannot hope to cover all of machine learning to any reasonable depth within the confines of 400 pages.
- Type: Chapter
- Information: Machine Learning: The Art and Science of Algorithms that Make Sense of Data, pp. xv - xviii
- Publisher: Cambridge University Press
- Print publication year: 2012