-
Learning from Disagreements: Discriminative Performance Evaluation
-
Christina Pavlopoulou and David Martin and Stella X. Yu and Hao Jiang
-
The 11th IEEE International Workshop on Performance Evaluation of Tracking and Surveillance, Miami Beach, Florida, 25 June 2009
-
Paper
|
Slides
-
Abstract
-
Selecting test cases for evaluating computer vision methods is important, yet it has not been addressed before. If the methods are evaluated on examples on which they all perform very well or very poorly, then no reliable conclusions can be drawn regarding the superiority of one method over the others. In this paper we put forth the idea that algorithms should be evaluated on the test cases on which they disagree most. We present a simple method that identifies the test cases that should be taken into account when comparing two algorithms and, at the same time, assesses the statistical significance of the differences in performance. We employ our methodology to compare two object detection algorithms and demonstrate its usefulness in highlighting the differences between the methods.
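The general idea in the abstract — rank test cases by how strongly two algorithms disagree, keep the most-disagreed cases, and test whether the performance difference on them is statistically significant — can be sketched as follows. This is an illustration only, not the paper's actual procedure: the per-case scores, the cutoff `top_k`, and the use of an exact two-sided sign test are all assumptions made for the sketch.

```python
import math

def disagreement_comparison(scores_a, scores_b, top_k):
    """Compare two algorithms on the test cases where they disagree most.

    scores_a, scores_b: per-test-case performance scores (higher is better)
    top_k: number of most-disagreed cases to keep (illustrative cutoff).
    Returns (selected case indices, wins for A, wins for B, p-value).
    """
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    # Rank cases by the magnitude of disagreement, largest first.
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]), reverse=True)
    selected = order[:top_k]
    wins_a = sum(diffs[i] > 0 for i in selected)
    wins_b = sum(diffs[i] < 0 for i in selected)
    n = wins_a + wins_b  # exact ties carry no sign information and are dropped
    # Exact two-sided sign test under H0: each algorithm is equally
    # likely to win any selected case (Binomial(n, 0.5)).
    k = min(wins_a, wins_b)
    p = 2 * sum(math.comb(n, i) for i in range(k + 1)) / 2 ** n
    return selected, wins_a, wins_b, min(p, 1.0)
```

For example, with four selected cases all won by algorithm A, the sign test gives p = 2 * (1/16) = 0.125: too few disagreement cases to reach significance, which is exactly the situation the selection step is meant to diagnose.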
-
Keywords
-
discriminative performance evaluation, object detection, object tracking