Joshua Landrum
Janis Hardwick
Quentin F. Stout
University of Michigan
Abstract: We give a sequential sampling approach to predict the time that a computer algorithm will take to solve a large problem of specified size on a given system. We use regression techniques to extrapolate observations of the time needed to solve smaller problems. The past observations and current predictions are used to adaptively decide the size of problem to be observed next. While based on classic ideas, some unique aspects arise in this setting. One is that observations on small inputs can be made more quickly than observations on larger inputs, which raises interesting design optimization questions when one has a fixed amount of time in which to generate the prediction. Another is the change-point phenomenon that occurs as one moves from a working set in cache to one in RAM (and perhaps even to one larger than the RAM available). Our measurements have also uncovered some interesting behavior of the computing systems.
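To illustrate the extrapolation idea (not the paper's actual method or data), the following sketch fits a power law t = c·n^a to hypothetical timing observations by least squares in log-log space, then extrapolates to a larger problem size. The observation values and the target size are invented for illustration only.

```python
import math

# Hypothetical timing observations: (problem size n, measured seconds).
# These numbers are illustrative, not measurements from the paper.
observations = [
    (1_000, 0.0020),
    (2_000, 0.0045),
    (4_000, 0.0101),
    (8_000, 0.0223),
]

def fit_power_law(obs):
    """Least-squares fit of t = c * n**a, done as linear regression
    on (log n, log t) pairs; returns (c, a)."""
    xs = [math.log(n) for n, _ in obs]
    ys = [math.log(t) for _, t in obs]
    k = len(obs)
    mx, my = sum(xs) / k, sum(ys) / k
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    c = math.exp(my - a * mx)
    return c, a

def predict(c, a, n):
    """Extrapolated running time at problem size n."""
    return c * n ** a

c, a = fit_power_law(observations)
big = predict(c, a, 1_000_000)  # extrapolate to a much larger problem
print(f"fitted exponent a = {a:.3f}; predicted time at n = 1e6: {big:.2f} s")
```

A real predictor along the paper's lines would also weigh the cost of each observation (small inputs are cheap to time) and watch for change points where the fitted exponent shifts, e.g. when the working set outgrows the cache.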
Keywords: computer performance prediction, benchmarking, response adaptive sampling design, predictive science, model checking
Complete paper. It appeared in Computing Science and Statistics 30 (1998), pp. 309-314.
Related Work: an explanation of adaptive sampling, and our relevant publications.
Copyright © 2005-2016 Quentin F. Stout