Quentin F. Stout

Computer Science and Engineering, University of Michigan

**Abstract:**
This paper considers the problem of determining
L_{∞} isotonic regressions satisfying order constraints given by
a partially ordered set specified via a DAG G = (V,E).
Given a real-valued function *f* and a nonnegative-valued function
*w* (the weights) on V, an isotonic regression of (*f,w*) is
an order-preserving function on V which minimizes the weighted distance
to *f* among all order-preserving functions.
Unlike isotonic regressions for L_{p} for 1 < p < ∞,
the L_{∞} regression is not unique.
An L_{∞} regression of special interest is
the *strict L_{∞} isotonic regression*, which is
the limit, as p → ∞, of the L_{p} isotonic regressions.
Several algorithms for generating non-strict L_{∞}
isotonic regressions have previously appeared
in the literature.
However, they do not always generate the expected regressions, even for
trivial DAGs.
For example, suppose the DAG is the chain {1,2,3} with the natural ordering,
the function values are f(1)=1, f(2)=-1, and f(3)=0.5, and the weights are all 1.
Then any optimal isotonic regression must have value 0 at 1 and 2.
One would expect the regression value at 3 to be 0.5,
but no previously studied general-purpose
algorithm for L_{∞} produces this.
However, the strict isotonic regression does have value 0.5 at 3, as does
L_{p} isotonic regression for any p < ∞.
We examine the behavior of previous algorithms as mappings from weighted
functions over a DAG G to isotonic functions over G, showing
that the fastest algorithms are not
monotonic mappings, and that no previously studied
algorithm preserves level set trimming (i.e., changing the function
to be its regression value on a level set of the regression).
In contrast, the strict L_{∞} isotonic
regression, and the L_{p} regressions for all 1 < p < ∞,
are monotonic and preserve level set trimming.
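
To make the three-point example concrete, here is a minimal Python sketch (my own, not code from the paper) of one standard way to produce *an* L_{∞} isotonic regression on an unweighted chain: taking the midpoint of the prefix maximum and suffix minimum at each point. On f = (1, -1, 0.5) it returns 0.75 at the third point rather than the 0.5 given by the strict regression, illustrating the behavior discussed above.

```python
# Sketch (not from the paper): the "prefix-max / suffix-min midpoint"
# rule for an unweighted L_inf isotonic regression on a chain.
# On f = (1, -1, 0.5) it yields 0.75 at the third point, not 0.5.

def midpoint_estimator(f):
    """An L_inf isotonic regression on a chain with unit weights:
    g(i) = (max_{j<=i} f(j) + min_{j>=i} f(j)) / 2."""
    n = len(f)
    prefix_max, suffix_min = [0.0] * n, [0.0] * n
    running = float("-inf")
    for i in range(n):                  # running maximum of f(1..i)
        running = max(running, f[i])
        prefix_max[i] = running
    running = float("inf")
    for i in range(n - 1, -1, -1):      # running minimum of f(i..n)
        running = min(running, f[i])
        suffix_min[i] = running
    return [(prefix_max[i] + suffix_min[i]) / 2 for i in range(n)]

print(midpoint_estimator([1, -1, 0.5]))  # [0.0, 0.0, 0.75], not 0.5 at 3
```

Both running sequences are nondecreasing, so their midpoint is isotonic; the third value shows how a correct but non-strict regression can differ from the expected one.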

Algorithms are given showing that if the partially ordered set has
*n* vertices, then for linear or tree orderings a simplification
of the
pool adjacent violators (PAV) algorithm yields the strict isotonic
regression in Θ(n log n) time.
For arbitrary orderings it
can be determined in Θ(TC(G) + m + m^{*} log m^{*}) time,
where TC(G) is the time to determine the transitive closure of G, m is the number of edges in the transitive closure, and m^{*} is the number of violating pairs (i.e., pairs u, v where u precedes v but f(u) > f(v)).
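
To illustrate the chain case, here is a hedged Python sketch (my own, not the paper's Θ(n log n) algorithm) of pool adjacent violators for unweighted data on a linear order, pooling each block to the midpoint of its minimum and maximum. On the example f = (1, -1, 0.5) it yields (0, 0, 0.5), matching the strict regression; the paper's algorithm additionally handles weights and achieves the stated bound.

```python
# Sketch: unweighted L_inf pool adjacent violators on a chain.
# Each block stores (min, max, count); its pooled value is (min+max)/2.
# This is illustrative only; the paper's algorithm is weighted.

def linf_pav(f):
    blocks = []  # each entry is [block_min, block_max, count]
    for x in f:
        blocks.append([x, x, 1])
        # merge backwards while adjacent block values violate the order
        while len(blocks) > 1:
            mn1, mx1, c1 = blocks[-2]
            mn2, mx2, c2 = blocks[-1]
            if (mn1 + mx1) / 2 <= (mn2 + mx2) / 2:
                break                      # no violation, stop merging
            blocks.pop()
            blocks[-1] = [min(mn1, mn2), max(mx1, mx2), c1 + c2]
    out = []
    for mn, mx, c in blocks:               # expand blocks to point values
        out.extend([(mn + mx) / 2] * c)
    return out

print(linf_pav([1, -1, 0.5]))  # [0.0, 0.0, 0.5]
```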

**Improvements:**

- Here is a simpler algorithm for determining the strict L_{∞} isotonic regression for a general DAG. It simplifies Algorithm B in the paper, replacing the dynamic priority queue with a single initial sort. In O-notation it is no faster, but the constants are much better.
- One fact not pointed out in the paper is the time to find the strict regression for n points in d-dimensional space with component-wise ordering. Since the transitive closure can be determined by pairwise comparisons, it takes only Θ(n^{2}) time, and thus the total time is Θ(n^{2} + m^{*} log m^{*}). This is Θ(n^{2} log n) in the worst case, but in practice should be faster.
- Below there is a link to a version of the paper which includes an appendix that did not appear in the journal publication. I added the appendix since I didn't notice the result until after the paper was in proof. It shows that if a regression mapping is monotonic and preserves level set trimming then it is the strict regression. See the paper for definitions of these terms.
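
The Θ(n^{2}) pairwise comparison mentioned above can be sketched as follows (a minimal illustration with hypothetical helper names, not code from the paper): under component-wise ordering, u precedes v exactly when u is ≤ v in every coordinate and u ≠ v, so both the transitive closure and the violating pairs m^{*} can be found by comparing all pairs.

```python
# Sketch: pairwise dominance comparison for n points in d dimensions,
# collecting the violating pairs (u precedes v but f(u) > f(v)).
# Helper names are illustrative, not from the paper.

def dominates(u, v):
    """Component-wise order: u precedes v if u <= v in every
    coordinate and u != v."""
    return all(a <= b for a, b in zip(u, v)) and u != v

def violating_pairs(points, f):
    """All index pairs (i, j) with points[i] preceding points[j]
    but f[i] > f[j]; Theta(n^2 d) comparisons in total."""
    n = len(points)
    return [(i, j) for i in range(n) for j in range(n)
            if dominates(points[i], points[j]) and f[i] > f[j]]

pts = [(0, 0), (1, 0), (1, 1)]   # a chain under dominance
vals = [1.0, -1.0, 0.5]          # the values from the example above
print(violating_pairs(pts, vals))  # [(0, 1), (0, 2)]
```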

**Keywords:** isotonic regression, nonparametric, shape constrained,
mini-max error, L_{∞}, uniform metric, Chebyshev metric,
partial order, linear order, tree, "best best" L_{∞} approximation, multivariate regression

**Complete paper**. This paper appears
in *Journal of Optimization Theory and Applications* 152 (2012),
pp. 121-135.

**Related work:** Here is information about the fastest
isotonic regression algorithms known to date, for a variety of orderings and metrics.
The most relevant of these to the current paper are
*Algorithms for L_{∞} isotonic regression* and *L_{∞} isotonic regression for linear, multidimensional, and tree orders*.

Copyright © 2010-2016 Quentin F. Stout.