-
Constrained Structured Regression with Convolutional Neural Networks
-
Deepak Pathak and Philipp Kraehenbuehl and Stella X. Yu and Trevor Darrell
-
arXiv:1511.07497, Online, 23 November 2015
-
arXiv
-
Abstract
-
Convolutional Neural Networks (CNNs) have recently emerged as the dominant model in computer vision. If provided with enough training data, they predict almost any visual quantity. In a discrete setting, such as classification, CNNs are not only able to predict a label but often predict a confidence in the form of a probability distribution over the output space. In continuous regression tasks, such a probability estimate is often lacking. We present a regression framework which models the output distribution of neural networks. This output distribution allows us to infer the most likely labeling following a set of physical or modeling constraints. These constraints capture the intricate interplay between different input and output variables, and complement the output of a CNN. However, they may not hold everywhere. Our setup further allows us to learn a confidence with which a constraint holds, in the form of a distribution of the constraint satisfaction. We evaluate our approach on the problem of intrinsic image decomposition, and show that constrained structured regression significantly improves on the state of the art.
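The idea of combining probabilistic network outputs with a hard constraint can be illustrated with a toy example (this is an illustrative sketch, not the paper's actual model). For intrinsic image decomposition, the identity I = R·S becomes linear in the log domain: log I = r + s. If a network predicted independent per-pixel Gaussians for log-reflectance r and log-shading s, the constrained MAP estimate would have a closed form in which the less confident channel absorbs more of the constraint violation:

```python
import numpy as np

def constrained_map(mu_r, var_r, mu_s, var_s, log_i):
    """Closed-form MAP of two independent Gaussian predictions
    under the hard constraint r + s = log_i (toy illustration)."""
    residual = log_i - (mu_r + mu_s)       # how much the constraint is violated
    total = var_r + var_s
    # Each channel moves in proportion to its variance (uncertainty),
    # so the more confident prediction changes less.
    r = mu_r + (var_r / total) * residual
    s = mu_s + (var_s / total) * residual
    return r, s

# Hypothetical per-pixel predictions for a 2x2 image.
mu_r = np.array([[0.1, -0.2], [0.0, 0.3]])
var_r = np.full((2, 2), 0.5)               # uncertain reflectance
mu_s = np.array([[-0.4, 0.1], [0.2, -0.1]])
var_s = np.full((2, 2), 0.1)               # confident shading
log_i = np.array([[-0.2, 0.0], [0.1, 0.3]])

r, s = constrained_map(mu_r, var_r, mu_s, var_s, log_i)
assert np.allclose(r + s, log_i)           # the constraint holds exactly
```

The paper's learned per-constraint confidence would correspond to softening this hard constraint; the sketch above only shows the limiting case where the constraint holds with certainty.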
-
Keywords
-
constrained structured regression, intrinsic image decomposition, deep learning, MPI Sintel, MIT Intrinsic Images, Intrinsic Images in the Wild