Book contents
- Frontmatter
- Contents
- Preface
- Part I Background
- Part II Applications, tools, and tasks
- Interlude — Good practices for scientific computing
- Part III Fundamentals
- Chapter 21 Networks demand network thinking: the friendship paradox
- Chapter 22 Network models
- Chapter 23 Statistical models and inference
- Chapter 24 Uncertainty quantification and error analysis
- Chapter 25 Ghost in the matrix: spectral methods for networks
- Chapter 26 Embedding and machine learning
- Chapter 27 Big data and scalability
- Conclusion
- Bibliography
- Index
Chapter 26 - Embedding and machine learning
from Part III - Fundamentals
Published online by Cambridge University Press: 06 June 2024
Summary
Machine learning, especially neural network methods, is increasingly important in network analysis. This chapter discusses the theoretical aspects of network embedding methods and graph neural networks. As we have seen, much of the success of advanced machine learning is thanks to useful representations—embeddings—of data. Embedding and machine learning are closely aligned. Translating network elements to embedding vectors and feeding those vectors as features to a predictive model often leads to a simpler, better-performing model than trying to work directly with the network. Embeddings help with network learning tasks, from node classification to link prediction. We can even embed entire networks and then use models to summarize and compare networks. Not only does machine learning benefit from embeddings; embeddings also benefit from machine learning. Inspired by the incredible recent progress with natural language data, embeddings created by predictive models are becoming more useful and important. Often these embeddings are produced by neural networks of various flavors, and we explore current approaches for using neural networks on network data.
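To make the embed-then-predict workflow concrete, here is a minimal sketch, not the chapter's reference implementation: it learns node embeddings from random walks in a simplified DeepWalk/node2vec style and feeds them to an off-the-shelf classifier for node classification. The choice of the karate club graph, the walk parameters, the embedding dimension, and the logistic-regression model are all illustrative assumptions.

```python
# Minimal embed-then-predict sketch (illustrative assumptions throughout):
# 1) generate random walks, 2) learn skip-gram embeddings, 3) classify nodes.
import random

import networkx as nx
from gensim.models import Word2Vec
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Small labeled network bundled with networkx; labels are the "club" attribute.
G = nx.karate_club_graph()
labels = {v: G.nodes[v]["club"] for v in G.nodes}

def random_walks(graph, num_walks=10, walk_length=20, seed=42):
    """Uniform random walks; each walk is treated as a 'sentence' of node tokens."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in graph.nodes:
            walk = [start]
            while len(walk) < walk_length:
                neighbors = list(graph.neighbors(walk[-1]))
                if not neighbors:
                    break
                walk.append(rng.choice(neighbors))
            walks.append([str(v) for v in walk])
    return walks

# Learn embedding vectors with word2vec (skip-gram) over the walk corpus.
model = Word2Vec(random_walks(G), vector_size=32, window=5,
                 min_count=1, sg=1, epochs=5)

# Use the embedding vectors as features for a downstream predictive model.
X = [model.wv[str(v)] for v in G.nodes]
y = [labels[v] for v in G.nodes]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out node classification accuracy:", clf.score(X_test, y_test))
```

The same embedding features could instead feed a link-prediction model or be replaced by embeddings learned end-to-end by a graph neural network, as the chapter goes on to discuss.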
- Type: Chapter
- Information: Working with Network Data: A Data Science Perspective, pp. 429-446
- Publisher: Cambridge University Press
- Print publication year: 2024