AP-Perf: Incorporating Generic Performance Metrics in Differentiable Learning
We propose a method that enables practitioners to conveniently incorporate custom non-decomposable performance metrics into differentiable learning pipelines, particularly those based on deep learning architectures. Our approach builds on the recently developed adversarial prediction framework, a distributionally robust approach that optimizes a metric in the worst case given the statistical summary of the empirical distribution. We formulate a marginal distribution technique to reduce the complexity of optimizing the adversarial prediction formulation over a vast range of non-decomposable metrics. We demonstrate that complex custom metrics can be written and incorporated easily using the tool we provide. Finally, we show the effectiveness of our approach on image classification tasks using the MNIST and Fashion-MNIST datasets, as well as on classification tasks over tabular data using benchmark datasets from the UCI repository.
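To illustrate what a non-decomposable metric looks like, the sketch below defines the F1 score purely in terms of confusion-matrix counts; unlike accuracy, it cannot be written as a sum of per-example losses, which is what makes metrics of this kind difficult to optimize directly during training. The function names and interface are hypothetical and purely illustrative; they are not the paper's actual tool or API.

```python
# Illustrative sketch only: this is NOT the paper's AP-Perf API.
# It shows the kind of non-decomposable metric (F1 score expressed from
# confusion-matrix counts) that the proposed framework aims to optimize.

import numpy as np

def f1_from_confusion(tp, fp, fn):
    """F1 score written from confusion-matrix counts; non-decomposable
    because it cannot be expressed as a sum of per-example losses."""
    denom = 2 * tp + fp + fn
    return 0.0 if denom == 0 else 2 * tp / denom

def confusion_counts(y_true, y_pred):
    """Compute TP/FP/FN for binary labels in {0, 1}."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))
    return tp, fp, fn

# Example: evaluate the metric on a small batch of predictions.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 1, 1]
tp, fp, fn = confusion_counts(y_true, y_pred)
print(f"F1 = {f1_from_confusion(tp, fp, fn):.3f}")  # F1 = 0.750
```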