Book contents
- Frontmatter
- Contents
- List of Figures
- List of Tables
- Preface
- 1 Introduction
- 2 The Perceptron
- 3 Logistic Regression
- 4 Implementing Text Classification Using Perceptron and Logistic Regression
- 5 Feed-Forward Neural Networks
- 6 Best Practices in Deep Learning
- 7 Implementing Text Classification with Feed-Forward Networks
- 8 Distributional Hypothesis and Representation Learning
- 9 Implementing Text Classification Using Word Embeddings
- 10 Recurrent Neural Networks
- 11 Implementing Part-of-Speech Tagging Using Recurrent Neural Networks
- 12 Contextualized Embeddings and Transformer Networks
- 13 Using Transformers with the Hugging Face Library
- 14 Encoder-Decoder Methods
- 15 Implementing Encoder-Decoder Methods
- 16 Neural Architectures for Natural Language Processing Applications
- Appendix A Overview of the Python Language and Key Libraries
- Appendix B Character Encodings: ASCII and Unicode
- References
- Index
4 - Implementing Text Classification Using Perceptron and Logistic Regression
Published online by Cambridge University Press: 01 February 2024
Summary
In the previous chapters, we discussed the theory behind the perceptron and logistic regression, including mathematical explanations of how and why they are able to learn from examples. In this chapter, we will transition from math to code: specifically, we will discuss how to implement these models in the Python programming language. All the code we introduce throughout this book is also available in the following GitHub repository: https://github.com/clulab/gentlenlp. To get a better understanding of how these algorithms work under the hood, we will start by implementing them from scratch. However, as the book progresses, we will introduce some of the popular tools and libraries that make Python the language of choice for machine learning – for example, PyTorch and Hugging Face’s transformers. The code for all the examples in the book is provided in the form of Jupyter notebooks. Fragments of these notebooks will be presented in the implementation chapters so that the reader gets the complete picture just by reading the book.
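As a taste of the from-scratch style the chapter adopts, below is a minimal, illustrative sketch of both models using NumPy. The toy dataset, variable names, and hyperparameters are our own inventions for illustration and are not taken from the book's repository. The perceptron updates its weights only on misclassified examples, while logistic regression follows the gradient of the cross-entropy loss, mirroring the theory from Chapters 2 and 3.

```python
import numpy as np

# Hypothetical toy dataset: each row of X is a feature vector,
# y holds the binary labels (0 or 1).
X = np.array([[1.0, 0.0],
              [0.9, 0.1],
              [0.1, 0.8],
              [0.0, 1.0]])
y = np.array([0, 0, 1, 1])

# Perceptron: predict 1 if the linear score is positive, and
# nudge the weights toward each misclassified example.
w, b = np.zeros(X.shape[1]), 0.0
for epoch in range(10):
    for x_i, y_i in zip(X, y):
        pred = int(w @ x_i + b > 0)
        if pred != y_i:                 # mistake-driven update
            w += (y_i - pred) * x_i     # (y_i - pred) is +1 or -1
            b += y_i - pred

# Logistic regression: the same linear score passed through a
# sigmoid, trained by batch gradient descent on cross-entropy.
w, b = np.zeros(X.shape[1]), 0.0
lr = 0.5                                # learning rate (arbitrary choice)
for epoch in range(100):
    p = 1 / (1 + np.exp(-(X @ w + b)))  # predicted P(y = 1 | x)
    w -= lr * (X.T @ (p - y)) / len(y)  # gradient of the mean loss w.r.t. w
    b -= lr * np.mean(p - y)            # gradient w.r.t. the bias

print("logistic P(y=1|x):", 1 / (1 + np.exp(-(X @ w + b))))
```

For the complete, runnable versions applied to real text classification data, see the Jupyter notebooks in the repository linked above.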
- Type: Chapter
- Information: Deep Learning for Natural Language Processing: A Gentle Introduction, pp. 49–72
- Publisher: Cambridge University Press
- Print publication year: 2024