Book contents
- Frontmatter
- Contents
- Preface
- Part I Monte Carlo basics
- Part II Finite temperature
- Part III Zero temperature
- Part IV Other topics
- Appendix A Alias method
- Appendix B Rejection method
- Appendix C Extended-ensemble methods
- Appendix D Loop/cluster algorithms: SU(N) model
- Appendix E Long-range interactions
- Appendix F Thouless's theorem
- Appendix G Hubbard-Stratonovich transformations
- Appendix H Multi-electron propagator
- Appendix I Zero temperature determinant method
- Appendix J Anderson impurity model: chain representation
- Appendix K Anderson impurity model: action formulation
- Appendix L Continuous-time auxiliary-field algorithm
- Appendix M Continuous-time determinant algorithm
- Appendix N Correlated sampling
- Appendix O The Bryan algorithm
- References
- Index
Appendix A - Alias method
Published online by Cambridge University Press: 05 May 2016
Summary
The methods for sampling from a discrete probability distribution over n events, presented in Algorithms 1 and 2, become inefficient when n is large. To devise an efficient algorithm for large n, we must distinguish the case where the list of probabilities is constantly changing from the case where it remains unchanged. In the latter case, there are several ways to boost efficiency by performing a modestly expensive operation once and then using cheaper operations in all subsequent samplings. One such approach is to sort the list of probabilities, which generally requires O(n log n) operations, and then use a bisection method, requiring O(log n) operations, to select events from the list. What we really want, however, is a method requiring only O(1) operations. Walker's alias algorithm (Walker, 1977) has this property.
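For comparison, the bisection approach mentioned above can be sketched as follows. This is a minimal Python sketch of our own, not an implementation from the book; the function and variable names are ours. The cumulative sums are precomputed once in O(n), after which each draw costs O(log n):

```python
import bisect
import itertools
import random

def make_cdf(weights):
    """Precompute the cumulative sums of the weights once: O(n)."""
    return list(itertools.accumulate(weights))

def sample(cdf, rng=random):
    """Draw one event index by bisection on the cumulative sums: O(log n)."""
    r = rng.random() * cdf[-1]          # uniform in [0, total weight)
    return bisect.bisect_right(cdf, r)  # index of the selected event
```

For the weights 6, 1, 3, 2, 8 of the example below, `make_cdf` yields the cumulative sums 6, 7, 10, 12, 20, and `sample` returns event 0 ("red") with probability 6/20, event 4 ("green") with probability 8/20, and so on.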
Suppose we need to repeatedly choose one of five colors randomly – red, blue, yellow, pink, and green – with weights 6, 1, 3, 2, and 8 (Fig. A.1). We first generate five sticks with lengths 6, 1, 3, 2, and 8 and paint each stick with the color it represents. Next, we define a linked list of the sticks that are longer than the average (which is 4), and another for the sticks that are shorter than the average. In the present case, the long-stick list is (“red”→“green”), and the short-stick list is (“blue” → “yellow” → “pink”). We pick the first item from each list and cut the longer (red) stick into two pieces in such a way that joining one of the two pieces to the shorter (blue) stick yields an average-length stick. As a result, we are left with a red stick of length 3 and a joint stick of length 4. Since the red stick has become shorter than the average length, we remove it from the long-stick list and append it to the short-stick list. The original blue stick, on the other hand, has become an average-length stick, so we remove it from the short-stick list. Then, we pick the first item from each list and repeat the same operations again and again. When finished, we have five sticks of average length, some with a single color and others with two colors.
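The stick-cutting construction above can be sketched in code. The sketch below is our own illustration, not the book's implementation, and the function names are ours. For convenience it pairs sticks in last-in-first-out rather than first-in-first-out order, which does not affect correctness: any pairing order leaves every slot with at most two colors. The resulting tables then support O(1) sampling, choosing a slot uniformly and then one of its at most two colors:

```python
import random

def build_alias(weights):
    """Build Walker's alias tables by the stick-cutting construction.
    Sticks are rescaled so that the average length is 1."""
    n = len(weights)
    avg = sum(weights) / n
    scaled = [w / avg for w in weights]   # stick lengths, average now 1
    prob = [0.0] * n   # fraction of each slot covered by its own color
    alias = [0] * n    # the second color pasted onto the slot
    short = [i for i, s in enumerate(scaled) if s < 1.0]
    long_ = [i for i, s in enumerate(scaled) if s >= 1.0]
    while short and long_:
        s, l = short.pop(), long_.pop()
        prob[s] = scaled[s]              # slot s keeps its own color here
        alias[s] = l                     # and borrows color l for the rest
        scaled[l] -= 1.0 - scaled[s]     # cut the borrowed piece off stick l
        (short if scaled[l] < 1.0 else long_).append(l)
    for i in short + long_:              # leftover sticks have average length
        prob[i] = 1.0
    return prob, alias

def alias_sample(prob, alias, rng=random):
    """O(1) draw: pick a slot uniformly, then one of its two colors."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

With the weights 6, 1, 3, 2, and 8 of the color example, each of the five slots ends up covered by at most two colors, and the total length of each color across all slots reproduces its original weight, so `alias_sample` returns "green" with probability 8/20 using only two random numbers per draw.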
In: Quantum Monte Carlo Methods: Algorithms for Lattice Models, pp. 416–417. Cambridge University Press, 2016.