Book contents
- Frontmatter
- Contents
- Preface
- Part I Monte Carlo basics
- 1 Introduction
- 2 Monte Carlo basics
- 3 Data analysis
- 4 Monte Carlo for classical many-body problems
- 5 Quantum Monte Carlo primer
- Part II Finite temperature
- Part III Zero temperature
- Part IV Other topics
- Appendix A Alias method
- Appendix B Rejection method
- Appendix C Extended-ensemble methods
- Appendix D Loop/cluster algorithms: SU(N) model
- Appendix E Long-range interactions
- Appendix F Thouless's theorem
- Appendix G Hubbard-Stratonovich transformations
- Appendix H Multi-electron propagator
- Appendix I Zero temperature determinant method
- Appendix J Anderson impurity model: chain representation
- Appendix K Anderson impurity model: action formulation
- Appendix L Continuous-time auxiliary-field algorithm
- Appendix M Continuous-time determinant algorithm
- Appendix N Correlated sampling
- Appendix O The Bryan algorithm
- References
- Index
4 - Monte Carlo for classical many-body problems
from Part I - Monte Carlo basics
Published online by Cambridge University Press: 05 May 2016
Summary
Our discussion of Markov chains, apart from mentioning the Metropolis and heat-bath algorithms, has so far been very general, making little contact with the issues and opportunities of specific applications. In this chapter, we recall that our target is many-body problems defined on a lattice and introduce several frameworks that exploit what is special about Markov processes for these types of problems. We consider here classical many-body problems, using the Ising model as the representative. Our discussion will be extended to various quantum many-body problems and algorithms in subsequent chapters.
Many-body phase space
The numerical difficulty of studying many-body problems on a lattice arises from the fact that their phase space Ω is a direct product of many phase spaces of local degrees of freedom. Generally, a local phase space is associated with each lattice site. If n is the size of this phase space and N is the number of lattice sites, then the number of states |Ω| available to the whole system is n^N. In other words, the number of states in the phase space grows exponentially fast with the physical size of the system. For example, in the Ising model, the Ising spin s_i on each site can take one of the two values ±1, and hence the number of states in the total phase space is 2^N.
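The exponential count 2^N can be checked by brute force on a tiny system. The sketch below (illustrative parameters, an open Ising chain with nearest-neighbor coupling J, not a reference implementation from the text) enumerates every configuration and records its energy:

```python
import itertools

def enumerate_ising_chain(N, J=1.0):
    """Enumerate all 2**N configurations of an open Ising chain and
    return the energy E = -J * sum_i s_i s_{i+1} of each one."""
    energies = []
    for config in itertools.product((-1, +1), repeat=N):
        E = -J * sum(config[i] * config[i + 1] for i in range(N - 1))
        energies.append(E)
    return energies

energies = enumerate_ising_chain(10)
print(len(energies))   # 2**10 = 1024 states
print(min(energies))   # ground state: all spins aligned, E = -(N-1)*J = -9.0
```

Already at N = 40 the loop would have to visit about 10^12 configurations, which makes the exponential wall concrete.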
In Chapter 1, we noted that this exponential scaling thwarts deterministic solutions and is a reason why we use the Monte Carlo method. The exponentially large number of states implies that the enumeration of all states, which requires a computational effort proportional to the size of the phase space, is not an option for a problem with a large number of sites. As discussed in Section 2.7, the Monte Carlo method samples only a small, representative part of the phase space, avoiding the need to visit every state. However, even with this advantage, can we reduce the computational effort to a manageable level?
An obvious requirement is that we can equilibrate the simulation in a reasonable amount of computer time. We know that the Markov process converges to a stationary state sampling some distribution (Section 2.4), and for algorithms satisfying detailed balance, Rosenbluth's theorem (Section 2.6) guarantees monotonic convergence.
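Convergence to the stationary (Boltzmann) distribution can be illustrated with a minimal single-spin-flip Metropolis simulation of the same small open Ising chain. This is a sketch with illustrative parameters (β = 0.5, J = 1, first half of the sweeps discarded as equilibration), not the book's reference implementation; for an open chain the exact mean energy is -(N-1) J tanh(βJ), which the estimate should approach:

```python
import math
import random

def metropolis_ising_chain(N=10, J=1.0, beta=0.5, sweeps=20000, seed=1):
    """Single-spin-flip Metropolis sampling of an open Ising chain.
    Returns the estimated mean total energy after equilibration."""
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(N)]

    def local_field(i):
        # Sum of the neighboring spins coupled to site i (open boundaries).
        h = 0
        if i > 0:
            h += s[i - 1]
        if i < N - 1:
            h += s[i + 1]
        return h

    E_sum, n_meas = 0.0, 0
    for sweep in range(sweeps):
        for i in range(N):
            dE = 2.0 * J * s[i] * local_field(i)  # energy cost of flipping s[i]
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                s[i] = -s[i]
        if sweep >= sweeps // 2:  # discard the first half as equilibration
            E_sum += -J * sum(s[i] * s[i + 1] for i in range(N - 1))
            n_meas += 1
    return E_sum / n_meas

estimate = metropolis_ising_chain()
exact = -9.0 * math.tanh(0.5)  # -(N-1) J tanh(beta J) for N=10
print(estimate, exact)
```

Comparing the estimate against the exact enumeration result checks that the chain has indeed equilibrated to the Boltzmann distribution rather than to some other stationary state.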
- Quantum Monte Carlo Methods: Algorithms for Lattice Models, pp. 66-83. Publisher: Cambridge University Press. Print publication year: 2016.