Differentiable Submodular Maximization

Abstract

We consider the problem of learning submodular functions from data. These functions are important in machine learning and have a wide range of applications, e.g., data summarization, feature selection, and active learning. Despite their combinatorial nature, submodular functions can be maximized approximately in polynomial time with strong theoretical guarantees. Typically, learning and optimization of the submodular function are treated separately, i.e., the function is first learned using a proxy objective and subsequently maximized. In contrast, we show how to perform learning and optimization jointly. By interpreting the output of greedy maximization algorithms as distributions over sequences of items and smoothing these distributions, we obtain a differentiable objective. In this way, we can differentiate through the maximization algorithms and optimize the model to work well with the optimization algorithm. We theoretically characterize the error made by our approach, yielding insights into the tradeoff between smoothness and accuracy. We demonstrate the effectiveness of our approach for jointly learning and optimizing on synthetic maximum-cut data, and on real-world applications such as product recommendation and image-collection summarization.
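The core idea described in the abstract, replacing the hard argmax in each greedy step with a temperature-controlled softmax over marginal gains so that selection probabilities become differentiable, can be sketched as follows. This is an illustrative reconstruction, not the authors' exact algorithm: the `soft_greedy` name, the toy coverage function, and the temperature parameterization are assumptions made for the example.

```python
import numpy as np

def soft_greedy(gain_fn, ground_set, k, temperature=1.0, seed=0):
    """Smoothed greedy selection (illustrative sketch): at each step the
    hard argmax over marginal gains is replaced by sampling from a
    softmax, so per-step selection probabilities are differentiable in
    the gains. Low temperature approaches the ordinary greedy algorithm."""
    rng = np.random.default_rng(seed)
    selected, remaining = [], list(ground_set)
    for _ in range(k):
        gains = np.array([gain_fn(selected, v) for v in remaining], dtype=float)
        # Numerically stable softmax over marginal gains.
        probs = np.exp((gains - gains.max()) / temperature)
        probs /= probs.sum()
        selected.append(remaining.pop(rng.choice(len(remaining), p=probs)))
    return selected

# Toy coverage function (assumed for illustration): each item covers a
# set of elements; the marginal gain of item v is the number of elements
# it covers that are not yet covered by the selected items.
coverage = {0: {1, 2}, 1: {2, 3}, 2: {4}, 3: {1, 4, 5}}

def marginal_gain(selected, v):
    covered = set().union(*(coverage[s] for s in selected)) if selected else set()
    return len(coverage[v] - covered)

# At a very low temperature the smoothed procedure recovers hard greedy.
print(soft_greedy(marginal_gain, coverage, k=2, temperature=1e-3))  # → [3, 1]
```

Because each step's selection probabilities are a softmax of the marginal gains, the expected objective under this distribution is differentiable in the parameters of the learned submodular function, which is what allows gradients to flow through the maximization procedure during training.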

Authors
  • Tschiatschek, Sebastian
  • Sahin, Aytunc
  • Krause, Andreas
Shortfacts
Category
Paper in Conference Proceedings or in Workshop Proceedings (Paper)
Event Title
International Joint Conference on Artificial Intelligence (IJCAI)
Divisions
Data Mining and Machine Learning
Event Location
Stockholm, Sweden
Event Type
Conference
Event Dates
13-19 July 2018
Series Name
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence Main track
ISSN/ISBN
978-0-9992411-2-7
Page Range
pp. 2731-2738
Date
13 July 2018
Official URL
https://www.ijcai.org/proceedings/2018/0379.pdf