Book contents
- Frontmatter
- Contents
- Preface
- List of abbreviations
- Part I Understanding energy consumption
- Part II Energy management and conservation
- Part III Advanced energy optimization
- 11 Overview
- 12 Traffic scheduling
- 13 Exploiting multiple wireless network interfaces
- 14 Mobile cloud offloading
- 15 Example scenarios for energy optimization
- 16 Future trends
- Appendix A An energy profile application
- Index
- References
12 - Traffic scheduling
from Part III - Advanced energy optimization
Published online by Cambridge University Press: 05 August 2014
Summary
Scheduling means timing actions appropriately, and mobile systems make such decisions continuously: for example, when to transmit the next packet, or when to switch execution to another process. Traditionally, computing systems have designed scheduling mechanisms to maximize the performance or throughput of the system. Considering energy consumption provides a different perspective.
How scheduling saves energy
The first motivation for considering scheduling for the sake of energy efficiency is that smartphones and their subsystems are not perfectly power proportional, and are sometimes very far from it. In a perfectly power-proportional system, the power draw would scale linearly with the workload throughout the entire range of possible workloads.
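The effect of imperfect power proportionality can be illustrated with a small sketch. The affine power curve and the `p_idle`/`p_max` values below are illustrative assumptions, not measurements from any particular device:

```python
def power_w(load, p_idle=0.4, p_max=1.6):
    """Assumed affine power curve for a subsystem: a real device draws
    p_idle watts even when doing no useful work, whereas a perfectly
    power-proportional one would draw zero at zero load."""
    return p_idle + (p_max - p_idle) * load

def energy_per_work(load):
    # Work completed per second is proportional to load, so joules per
    # unit of work is instantaneous power divided by load.
    return power_w(load) / load

print(energy_per_work(0.1))   # light load: idle power dominates the bill
print(energy_per_work(1.0))   # full load: fewest joules per unit of work
```

Because the idle term is paid regardless of load, energy per unit of work is worst at light load, which is exactly why scheduling work into fewer, busier periods can save energy.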
To understand the benefits of energy-aware scheduling, let us consider an example of wireless communication with a smartphone. Wireless communication typically provides better energy utility at a higher data rate; in other words, a higher data rate leads to fewer joules consumed per bit transmitted or received. Because of this, it is more energy efficient to schedule multiple lower-rate data flows so that they overlap maximally, effectively aggregating them into a single higher-rate transfer, than to schedule them one after another.
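The arithmetic behind this claim can be sketched as follows. The power model is an assumed affine one, `P(rate) = p_base + k * rate`, and the constants and flow sizes are hypothetical, chosen only to make the comparison concrete:

```python
def transfer_energy_j(bits, rate_bps, p_base_w=0.8, k_w_per_mbps=0.05):
    """Energy for one transfer under an assumed affine power model
    P(rate) = p_base + k * rate. The constant base term is what makes
    joules per bit fall as the data rate rises."""
    rate_mbps = rate_bps / 1e6
    duration_s = bits / rate_bps
    return duration_s * (p_base_w + k_w_per_mbps * rate_mbps)

flow_bits = 50e6   # two hypothetical 50 Mbit flows
rate = 5e6         # 5 Mbit/s each

# Scheduled back to back: the radio pays the base power for twice as long.
sequential = 2 * transfer_energy_j(flow_bits, rate)

# Fully overlapped: one transfer at the aggregate 10 Mbit/s rate.
overlapped = transfer_energy_j(2 * flow_bits, 2 * rate)

print(f"sequential: {sequential:.1f} J, overlapped: {overlapped:.1f} J")
```

Under these assumptions the overlapped schedule saves exactly the base power integrated over the halved active time, which is the intuition the paragraph above describes.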
Generally speaking, it is beneficial to exercise the above kind of “race-to-sleep” scheduling policy on subsystems whose power consumption scales sublinearly with the workload. Wireless network interfaces (WNIs) exhibit such behavior because of their inherent tail energy, as shown by Pathak et al. [1]. The main challenge is to find the optimal scheduling algorithm that takes the constraints of the particular subsystem into account. Scheduling algorithms in general have been studied for several decades.
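One simple tail-energy-aware policy is to hold packets briefly and send them in bursts, so that nearby packets share a single wake-up and a single tail period. The sketch below is a minimal greedy batcher; the tail duration, active power, per-packet send time, and arrival trace are all assumed values for illustration, not parameters of any specific radio:

```python
TAIL_S = 3.0        # assumed tail: radio lingers in a high-power state after a send
P_ACTIVE_W = 1.2    # assumed power draw of the high-power state
SEND_S = 0.1        # assumed airtime per packet

def batched_energy(arrivals_s, max_delay_s):
    """Greedy batching: each packet may be delayed up to max_delay_s,
    so packets arriving close together share one burst and one tail."""
    energy, i, n = 0.0, 0, len(arrivals_s)
    while i < n:
        deadline = arrivals_s[i] + max_delay_s
        j = i
        while j + 1 < n and arrivals_s[j + 1] <= deadline:
            j += 1
        burst_s = (j - i + 1) * SEND_S            # packets sent back to back
        energy += P_ACTIVE_W * (burst_s + TAIL_S)  # one shared tail per burst
        i = j + 1
    return energy

arrivals = [0.0, 0.5, 1.0, 10.0, 10.2]
print(batched_energy(arrivals, 0.0))   # no delay allowed: every packet pays a full tail
print(batched_energy(arrivals, 2.0))   # modest delay: the tail is amortized across bursts
```

The trade-off this exposes, added latency in exchange for fewer tail periods, is precisely the kind of subsystem constraint an optimal scheduling algorithm must account for.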
Smartphone Energy Consumption: Modeling and Optimization, pp. 234–263. Publisher: Cambridge University Press. Print publication year: 2014.