- Previous work: approximately optimal rules
located and analyzed via asymptotics and simulation.
Asymptotics are a critical part of the overall process,
but are not very useful when one has to pick
specific allocations for specific sample sizes.
- We were quite surprised to discover that the growth rate of the
optimal initial stage sizes is not what the asymptotics predict. The same
holds for optimal 3-stage allocation, and we have observed similar growth
rates in other problems.
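As a toy illustration of why one searches exactly over stage sizes at specific sample sizes rather than relying on asymptotics, the sketch below grid-searches the first-stage size `m` for an *invented* two-stage risk, `1/m + m/N` (a stand-in trading off a poor stage-one estimate against early commitment, not the risk function studied here):

```python
def best_first_stage(N, risk):
    """Exact grid search over candidate first-stage sizes m = 1..N-1."""
    return min(range(1, N), key=lambda m: risk(m, N))

# Purely illustrative criterion: 1/m penalizes a small first stage,
# m/N penalizes committing too many observations early.
toy_risk = lambda m, N: 1.0 / m + m / N

for N in (25, 100, 400, 1600):
    print(N, best_first_stage(N, toy_risk))
```

For this made-up criterion the exact minimizer happens to track a clean rate, but the point stands: at a specific N one can simply compute the optimum rather than trust a limiting formula.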
- The optimal rule (OS) is necessary as a basis for comparison: without it, one
has no definite bound on how much is being lost when a suboptimal
procedure is used.
Previously, many people felt that OS was too difficult to compute, but we have shown that this can now be accomplished for useful sample sizes.
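To make "computing an optimal rule" concrete, here is a minimal dynamic-programming sketch for a stand-in problem (a two-armed Bernoulli bandit with uniform priors, chosen purely for illustration and not the design problem analyzed here): backward induction over the `(s1, f1, s2, f2)` success/failure state space gives the exact Bayes-optimal expected value.

```python
from functools import lru_cache

N = 20  # horizon; state space is all (s1, f1, s2, f2) with total <= N

@lru_cache(maxsize=None)
def value(s1, f1, s2, f2):
    """Expected future successes under the optimal adaptive allocation,
    assuming independent Beta(1,1) priors on the two success probabilities."""
    if s1 + f1 + s2 + f2 == N:
        return 0.0
    # posterior mean success probability of each arm
    p1 = (s1 + 1) / (s1 + f1 + 2)
    p2 = (s2 + 1) / (s2 + f2 + 2)
    v1 = p1 * (1 + value(s1 + 1, f1, s2, f2)) + (1 - p1) * value(s1, f1 + 1, s2, f2)
    v2 = p2 * (1 + value(s1, f1, s2 + 1, f2)) + (1 - p2) * value(s1, f1, s2, f2 + 1)
    return max(v1, v2)  # optimal rule: sample whichever arm maximizes this

print(value(0, 0, 0, 0))
```

The state space grows polynomially in N, so the feasibility question is about careful implementation, not asymptotic impossibility.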

- New, efficient algorithms were needed to perform the
extensive evaluations required for robustness/sensitivity analyses.
The most important new algorithm is forward induction,
coupled with careful implementation.
Concurrent with our initial clinical trials work in 1991, which used N = 150, P. Jones (1992) noted that he could handle only N = 25. Obtaining our results with his approach would have taken approximately **100,000 times as long**, i.e., his programs would still be running! (His programs were standard implementations; we single him out only because his work was concurrent with ours and he stated explicitly what many others noted only implicitly.) Since then, we have analyzed problems with sample sizes as large as N = 400.

- Simple fully adaptive rules are good enough to use in practice.
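A minimal sketch of forward induction in this spirit, again assuming a toy two-armed Bernoulli setting (an illustration, not the procedures analyzed here): instead of recursing backward, propagate the exact probability of every reachable state forward one observation at a time. This evaluates an arbitrary, possibly suboptimal, adaptive rule exactly; it is the kind of computation a robustness analysis must repeat many times.

```python
from collections import defaultdict

def evaluate_rule(rule, N, p1, p2):
    """Exact evaluation of an adaptive allocation rule by forward induction:
    push the probability mass of each reachable (s1, f1, s2, f2) state
    forward one observation at a time."""
    dist = {(0, 0, 0, 0): 1.0}
    for _ in range(N):
        nxt = defaultdict(float)
        for (s1, f1, s2, f2), pr in dist.items():
            if rule(s1, f1, s2, f2) == 1:
                nxt[(s1 + 1, f1, s2, f2)] += pr * p1
                nxt[(s1, f1 + 1, s2, f2)] += pr * (1 - p1)
            else:
                nxt[(s1, f1, s2 + 1, f2)] += pr * p2
                nxt[(s1, f1, s2, f2 + 1)] += pr * (1 - p2)
        dist = dict(nxt)
    # expected number of successes under this rule
    return sum(pr * (s1 + s2) for (s1, f1, s2, f2), pr in dist.items())

def greedy(s1, f1, s2, f2):
    """A simple myopic rule: sample the arm with the higher posterior mean
    (Beta(1,1) priors; ties go to arm 1)."""
    return 1 if (s1 + 1) / (s1 + f1 + 2) >= (s2 + 1) / (s2 + f2 + 2) else 2

print(evaluate_rule(greedy, 20, 0.7, 0.4))
```

Only reachable states carry mass, which is what makes the forward pass far cheaper than naive enumeration.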
- Adaptive few-stage rules are remarkably good as well.
Note that there has been **no** previous work on determining fully optimal few-stage designs. No one had obtained fully optimal stage sizes, nor even determined how to compute them.

- Adaptive procedures are **very** robust.

While the focus here was on estimation of bivariate polynomials, similar design/analysis techniques are applicable to many other problems.