Imperial College London
Describing exogenous variability in the resources used by a cloud application leads to stochastic performance models that are difficult to solve. In this paper, the authors describe the blending algorithm, a novel approximation for queueing network models immersed in a random environment. Random environments are Markov-chain-based descriptions of time-varying operational conditions that evolve independently of the system state; they are therefore natural descriptors of exogenous variability in a cloud deployment. The algorithm adopts the principle of solving a separate transient-analysis sub-problem for each state of the random environment.
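To make the notion of a random environment concrete, the sketch below (not the blending algorithm itself, which the paper describes) models the environment as a small continuous-time Markov chain and computes its transient state probabilities by uniformization, the kind of transient-analysis computation the abstract alludes to. The two-state "normal/degraded" environment and its rates are hypothetical values chosen for illustration.

```python
import numpy as np

def transient_probs(Q, p0, t, tol=1e-12):
    """Transient state probabilities p(t) = p0 * exp(Q t) of a CTMC,
    computed by uniformization (Poisson-weighted DTMC powers).

    Q: generator matrix; p0: initial distribution; t: time horizon.
    """
    Q = np.asarray(Q, dtype=float)
    lam = -Q.diagonal().min()             # uniformization rate >= max exit rate
    P = np.eye(Q.shape[0]) + Q / lam      # embedded DTMC transition matrix
    term = np.asarray(p0, dtype=float)    # accumulator for p0 * P^k
    poisson = np.exp(-lam * t)            # Poisson(lam*t) weight at k = 0
    result = poisson * term
    k = 0
    # Sum terms until both the Poisson tail is negligible and we are
    # past the mode of the Poisson weights at k ~ lam*t.
    while poisson > tol or k < lam * t:
        k += 1
        term = term @ P
        poisson *= lam * t / k
        result += poisson * term
    return result

# Hypothetical two-state environment: "normal" <-> "degraded",
# switching rates 1.0 (normal -> degraded) and 0.25 (back).
Q = [[-1.0, 1.0],
     [0.25, -0.25]]
p = transient_probs(Q, p0=[1.0, 0.0], t=5.0)
print(p)  # by t = 5 the chain is close to its stationary law [0.2, 0.8]
```

In a model immersed in such an environment, each environment state would select a different parameterization of the queueing network, and a transient analysis like the one above would be carried out per state before the results are combined.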