Quentin F. Stout

Computer Science and Engineering

University of Michigan

**Abstract**:
This paper gives algorithms for univariate unimodal regression,
that is, for determining the optimal
regression which is increasing and then decreasing (more technically, nondecreasing and then nonincreasing).
Such regressions arise in a wide variety of applications.
They are shape-constrained nonparametric
regressions, closely related to isotonic regression (i.e., regression where the regression function is nondecreasing).
Unimodal orderings are occasionally called umbrella orderings.

For unimodal regression on *n* weighted points given in
increasing abscissa order, the time required is

- L_{2} metric: O(*n*),
- L_{1} metric: O(*n* log *n*),
- L_{∞} metric: O(*n*), though here the points must be unweighted.

Note that the optimal isotonic or unimodal regressions are not unique for
L_{1} and L_{∞}.
E.g., for unweighted data 4, 0, 5, an optimal L_{1} isotonic regression is of the form x, x, 5, where x can be any value in [0,4], while an optimal L_{∞} regression is of the form 2, 2, y, where y can be any value in [3,7].
Isotonic regression for L_{2} is unique, but unimodal L_{2} regression in general is not, as is shown by the unweighted data 2, 0, 2, which has optimal unimodal regressions 1, 1, 2 and 2, 1, 1.
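These non-uniqueness examples can be checked directly. The following quick sketch (helper names are illustrative, not from the paper) verifies that every fit in the stated family attains the same optimal error:

```python
# Quick check of the non-uniqueness examples for unweighted data 4, 0, 5.
data = [4, 0, 5]

def l1_err(fit):
    # sum of absolute deviations (L1 error)
    return sum(abs(y - f) for y, f in zip(data, fit))

def linf_err(fit):
    # maximum absolute deviation (L-infinity error)
    return max(abs(y - f) for y, f in zip(data, fit))

# Every x in [0, 4] gives the same L1 error for the isotonic fit x, x, 5:
assert all(l1_err([x, x, 5]) == 4 for x in [0, 1, 2, 3, 4])

# Every y in [3, 7] gives the same L-infinity error for the fit 2, 2, y:
assert all(linf_err([2, 2, y]) == 2 for y in [3, 4, 5, 6, 7])
```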

Previous algorithms were solely for the L_{2} metric and
required Ω(*n*^{2}) time.
All previous algorithms
used multiple calls to isotonic regression, and our major contribution
is to organize these into a prefix isotonic regression,
determining the regression on all initial segments. The prefix approach
reduces the total time required by
utilizing the solution for one initial segment to solve the next.
The prefix algorithm is also useful when one is performing isotonic
regression for an on-going time series, since it allows one to
add new data with only a constant amortized amount of time per new
data point (for the L_{2} metric).
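For the L_{2} metric, the incremental behavior can be sketched with pool adjacent violators (PAV) maintained as a stack of level sets. This is an illustrative sketch, not the paper's implementation; the class name `PrefixPAV` is hypothetical. Each append merges violating blocks, so the total work over *n* appends is O(*n*), i.e., amortized constant time per point, and the squared error of the isotonic fit on the current prefix is available after every append:

```python
class PrefixPAV:
    """Incremental pool-adjacent-violators for weighted L2 isotonic
    (nondecreasing) regression.  Each append is amortized O(1), and the
    squared error of the optimal isotonic fit on the prefix seen so far
    is available at any time."""

    def __init__(self):
        self.blocks = []      # stack of [weighted sum, weight, count]
        self.sum_wy2 = 0.0    # cumulative sum of w * y^2
        self.block_fit = 0.0  # sum over blocks of (weighted sum)^2 / weight

    def append(self, y, w=1.0):
        self.sum_wy2 += w * y * y
        s, wt, c = w * y, w, 1
        self.block_fit += s * s / wt
        # merge while the previous block's mean exceeds the new block's mean
        while self.blocks and self.blocks[-1][0] * wt > s * self.blocks[-1][1]:
            ps, pw, pc = self.blocks.pop()
            self.block_fit -= ps * ps / pw + s * s / wt
            s, wt, c = ps + s, pw + wt, pc + c
            self.block_fit += s * s / wt
        self.blocks.append([s, wt, c])

    def error(self):
        # L2 squared error of the isotonic fit on the current prefix
        return self.sum_wy2 - self.block_fit

    def fit(self):
        # expand the level sets into the fitted value at each point
        out = []
        for s, wt, c in self.blocks:
            out.extend([s / wt] * c)
        return out
```

For example, appending 4, 0, 5 merges the first two points into a level set of mean 2, giving the fit 2, 2, 5 with squared error 8.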

Algorithms are also provided for pointwise evaluations throughout the history of the data.
That is, after the prefix isotonic regressions have been constructed,
then for the isotonic regression on the first *m*
values, 0 < *m* ≤ *n*, one can determine

- the value of the regression at *x*,
- the range of *x*'s for which the regression has the given value *y*.

Note that the prefix approach can be applied to several other approximations when only two pieces are desired, for example piecewise monotonic (i.e., each piece is either monotonic increasing or monotonic decreasing), piecewise constant (also known as step functions or histogramming), piecewise linear, and piecewise quadratic.
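The two-piece idea for unimodal L_{2} regression can be sketched as follows, assuming a prefix PAV routine (the function names here are illustrative): compute the isotonic error on every prefix going forward, the antitonic error on every suffix by running the same routine on the reversed data, and take the split minimizing their sum.

```python
def prefix_isotonic_errors(ys, ws):
    """Squared errors of the optimal nondecreasing L2 fit on every
    prefix, via pool adjacent violators; err[m] is the error on ys[:m]."""
    blocks = []            # stack of (weighted sum, weight)
    sum_wy2 = fit = 0.0
    err = [0.0]
    for y, w in zip(ys, ws):
        sum_wy2 += w * y * y
        s, wt = w * y, w
        fit += s * s / wt
        # merge while the previous block's mean is >= the new block's mean
        while blocks and blocks[-1][0] * wt >= s * blocks[-1][1]:
            ps, pw = blocks.pop()
            fit += (ps + s) ** 2 / (pw + wt) - ps * ps / pw - s * s / wt
            s, wt = ps + s, pw + wt
        blocks.append((s, wt))
        err.append(sum_wy2 - fit)
    return err

def unimodal_l2_error(ys, ws=None):
    """O(n) unimodal L2 regression via the prefix approach: prefix
    isotonic errors forward, suffix antitonic errors backward (i.e.,
    isotonic on the reversed data), then the best mode split.
    Returns (mode position, optimal squared error)."""
    n = len(ys)
    ws = ws or [1.0] * n
    pre = prefix_isotonic_errors(ys, ws)           # increasing on ys[:m]
    suf = prefix_isotonic_errors(ys[::-1], ws[::-1])  # decreasing on ys[m:]
    m = min(range(n + 1), key=lambda k: pre[k] + suf[n - k])
    return m, pre[m] + suf[n - m]
```

Any unimodal sequence splits at its mode into a nondecreasing prefix and a nonincreasing suffix, so minimizing over all splits gives the optimal unimodal error.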

**Keywords**:
unimodal regression, isotonic regression, median regression,
umbrella ordering, minimax, least squares, piecewise monotonic,
prefix scan, pool adjacent violators (PAV)

**Complete paper.**
This paper appears in *Computational Statistics and Data Analysis*
**53** (2008), pp. 289-297. doi:10.1016/j.csda.2008.08.005
It was submitted in 2003, and simultaneously put on the web, but a
series of absurd delays occurred (one of them due to me) so that
the journal version appeared years later.
A preliminary version appeared in
"Optimal algorithms for unimodal regression",
*Computing Science and Statistics* 32 (2000).
Here is a link to it, but the journal version is much better and is the one that should be cited.

**Related work:**
My interest in this problem arose from its use in
adaptive sampling designs
for dose-response optimization in
phase I/II clinical trials.

Here is information on the fastest isotonic regression algorithms known to date.
Here are papers on L_{1} isotonic regression (also known as isotonic median regression),
L_{∞} isotonic regression for arbitrary dags
(also known as minimax, uniform, or Chebyshev isotonic regression),
isotonic regression for points in multidimensional space,
L_{∞} isotonic regression for linear, multidimensional, and tree orders,
and isotonic regression
with a fixed number of level sets.
The paper on L_{1} isotonic regression also includes results
for L_{p}, 1 < p < ∞.

Copyright © 2003-2016 Quentin F. Stout |