Date Added: May 2010
In this paper, the authors present a dynamic load-balancing algorithm for optimistic gate-level simulation that makes use of a machine-learning approach. They first introduce two dynamic load-balancing algorithms for a Time Warp simulator, oriented towards balancing the computational load and the communication load, respectively. They then use a multi-state Q-learning approach to create an algorithm that combines the first two. The Q-learning algorithm determines the values of three important parameters: the number of processors that participate in the algorithm, the amount of load exchanged during its execution, and the type of load-balancing algorithm applied. They evaluate the algorithm on gate-level simulations of several open-source VLSI circuits.
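The Q-learning component described above can be illustrated with a minimal sketch: a tabular Q-learner that picks, per simulator state, an action bundling the three tuned parameters (processor count, load fraction, and which of the two load-balancing algorithms to apply). All names, state/action encodings, the synthetic reward, and the hyperparameters below are illustrative assumptions, not the authors' implementation.

```python
import random

# Hypothetical action space: each action bundles the three parameters the
# paper's Q-learning agent tunes. The specific values are assumptions.
ACTIONS = [
    {"processors": p, "load_fraction": f, "algorithm": a}
    for p in (2, 4)
    for f in (0.1, 0.25)
    for a in ("computational", "communication")
]

class QLearner:
    """Minimal tabular Q-learning over a small discrete state space."""

    def __init__(self, n_states, n_actions, alpha=0.5, gamma=0.9, eps=0.2):
        self.q = [[0.0] * n_actions for _ in range(n_states)]
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def select(self, state):
        # Epsilon-greedy: explore occasionally, otherwise exploit best Q.
        if random.random() < self.eps:
            return random.randrange(len(self.q[state]))
        row = self.q[state]
        return row.index(max(row))

    def update(self, state, action, reward, next_state):
        # Standard Q-learning update: move Q toward reward + discounted
        # value of the best action in the next state.
        best_next = max(self.q[next_state])
        self.q[state][action] += self.alpha * (
            reward + self.gamma * best_next - self.q[state][action]
        )

def toy_reward(state, action):
    # Stand-in reward: in the paper this would be derived from observed
    # simulation behavior (e.g. load imbalance or rollbacks); here it is
    # synthetic, rewarding one fixed action per state.
    return 1.0 if action == state % len(ACTIONS) else 0.0

random.seed(0)
agent = QLearner(n_states=3, n_actions=len(ACTIONS))
state = 0
for _ in range(2000):
    a = agent.select(state)
    r = toy_reward(state, a)
    next_state = (state + 1) % 3
    agent.update(state, a, r, next_state)
    state = next_state

# After training, the greedy action in each state is the rewarded one.
greedy = [agent.q[s].index(max(agent.q[s])) for s in range(3)]
print(greedy)
```

In a real Time Warp setting the state would encode observed load metrics and the reward would reflect simulation progress; the tabular structure and epsilon-greedy selection shown here are the generic Q-learning machinery.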