Server Frequency Control Using Markov Decision Processes

Executive Summary

For a wide range of devices and servers, Dynamic Frequency Scaling (DFS) can reduce energy consumption to varying degrees by appropriately trading off system performance. Efficient DFS policies adjust server frequencies by anticipating transitions in the highly varying workload without incurring significant implementation overhead. This paper models the DFS policies of a single server using Markov Decision Processes (MDP). To accommodate the highly varying nature of the workload in the proposed MDP, the authors adopt a fluid approximation, modeling the fluid workload generator with a continuous-time Markov chain and a discrete-time Markov chain, respectively.
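To make the MDP framing concrete, the sketch below solves a deliberately tiny frequency-control MDP by value iteration. All states, transition probabilities, and cost coefficients are invented for illustration and are not taken from the paper: the state is a (workload level, current frequency) pair, the action is the next frequency, and the cost trades off a convex energy term against a delay penalty plus a small frequency-switching cost.

```python
import itertools

# Hypothetical toy model (not the paper's): 3 workload levels, 3 frequencies.
WORKLOADS = [0, 1, 2]          # low / medium / high load
FREQS = [1.0, 2.0, 3.0]        # normalized clock frequencies
GAMMA = 0.9                    # discount factor
SWITCH_COST = 0.3              # penalty for changing frequency

# Assumed workload transition matrix (rows: current level, cols: next level).
P_WORK = [
    [0.7, 0.2, 0.1],
    [0.2, 0.6, 0.2],
    [0.1, 0.3, 0.6],
]

def cost(w, f_old, f_new):
    """Energy grows with frequency; delay grows when under-clocked."""
    energy = 0.3 * FREQS[f_new] ** 2
    delay = 5.0 * WORKLOADS[w] / FREQS[f_new]
    switch = SWITCH_COST if f_new != f_old else 0.0
    return energy + delay + switch

def value_iteration(tol=1e-8):
    """Standard discounted value iteration over (workload, frequency) states."""
    states = list(itertools.product(range(len(WORKLOADS)), range(len(FREQS))))
    V = {s: 0.0 for s in states}
    while True:
        V_new = {}
        for w, f in states:
            V_new[(w, f)] = min(
                cost(w, f, a)
                + GAMMA * sum(P_WORK[w][w2] * V[(w2, a)]
                              for w2 in range(len(WORKLOADS)))
                for a in range(len(FREQS))
            )
        if max(abs(V_new[s] - V[s]) for s in states) < tol:
            return V_new
        V = V_new

def greedy_policy(V):
    """Extract the frequency choice that minimizes expected discounted cost."""
    pol = {}
    for w, f in itertools.product(range(len(WORKLOADS)), range(len(FREQS))):
        pol[(w, f)] = min(
            range(len(FREQS)),
            key=lambda a: cost(w, f, a)
            + GAMMA * sum(P_WORK[w][w2] * V[(w2, a)]
                          for w2 in range(len(WORKLOADS))),
        )
    return pol

if __name__ == "__main__":
    V = value_iteration()
    pol = greedy_policy(V)
    for w in range(len(WORKLOADS)):
        print(f"workload {w}: choose frequency {FREQS[pol[(w, 0)]]}")
```

With these made-up costs the resulting policy scales frequency up as workload rises, which is the qualitative behavior an efficient DFS policy targets; the paper's actual model replaces this discrete workload chain with a fluid approximation driven by CTMC/DTMC workload generators.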

  • Format: PDF
  • Size: 215.4 KB