Improving Optimization-Based Approximate Inference by Clamping Variables

Abstract

While central to the application of probabilistic models to discrete data, the problem of marginal inference is in general intractable, and efficient approximation schemes need to exploit the problem structure. Recently, there have been efforts to develop inference techniques that do not necessarily make factorization assumptions about the distribution, but rather exploit the fact that sometimes there exist efficient algorithms for finding the MAP configuration. In this paper, we theoretically prove that for discrete multi-label models the bounds on the partition function obtained by two of these approaches, Perturb-and-MAP and the bound from the infinite Rényi divergence, can only be improved by clamping any subset of the variables. For the case of log-supermodular models we provide a more detailed analysis and develop a set of efficient strategies for choosing the order in which the variables should be clamped. Finally, we present a number of numerical experiments showcasing the improvements obtained by the proposed methods on several models.

Authors
  • Zhao, Junyao
  • Djolonga, Josip
  • Tschiatschek, Sebastian
  • Krause, Andreas
Shortfacts
Category
Paper in Conference Proceedings or in Workshop Proceedings (Paper)
Event Title
Uncertainty in Artificial Intelligence (UAI)
Divisions
Data Mining and Machine Learning
Event Location
Sydney, Australia
Event Type
Conference
Event Dates
11–15 August 2017
Series Name
Proceedings of the Thirty-Third Conference on Uncertainty in Artificial Intelligence
Date
11 August 2017
Official URL
http://auai.org/uai2017/proceedings/papers/259.pdf