Convex Combinations of Maximum Margin Bayesian Network Classifiers

Abstract

Maximum margin Bayesian networks (MMBN) can be trained by solving a convex optimization problem using, for example, interior point (IP) methods (Guo et al., 2005). However, for large datasets this training is computationally expensive in terms of runtime and memory requirements. Therefore, we propose a less resource-intensive batch method to approximately learn an MMBN classifier: we train a set of (weak) MMBN classifiers on subsets of the training data and then exploit the convexity of the original optimization problem to obtain an approximate solution, i.e., we determine a convex combination of the weak classifiers. In experiments on different datasets we obtain results similar to those of the optimal MMBN determined on all training samples. However, the proposed method is faster in terms of runtime and its memory requirements are much lower. Further, it facilitates parallel implementation.

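The combination step described in the abstract can be illustrated with a short sketch. The following Python/numpy snippet is a minimal illustration, not the authors' implementation: it assumes each weak MMBN is represented by a parameter vector of fixed length obtained from training on one data subset, and it forms a convex combination with non-negative weights summing to one. Names such as combine_weak_classifiers, weak_params, and mixing_weights are hypothetical.

```python
import numpy as np

def combine_weak_classifiers(weak_params, mixing_weights):
    """Form a convex combination of weak classifier parameters.

    weak_params    : list of 1-D numpy arrays, one per weak MMBN,
                     all of the same length (e.g., log-linear parameters).
    mixing_weights : non-negative coefficients that sum to one.
    Returns the combined parameter vector.
    """
    w = np.asarray(mixing_weights, dtype=float)
    if np.any(w < 0) or not np.isclose(w.sum(), 1.0):
        raise ValueError("mixing weights must be non-negative and sum to one")
    return sum(wi * p for wi, p in zip(w, weak_params))

# Example: three weak classifiers trained on disjoint subsets,
# combined with uniform weights (placeholder parameter vectors).
weak_params = [np.random.randn(10) for _ in range(3)]
combined = combine_weak_classifiers(weak_params, [1/3, 1/3, 1/3])
```

Because the original training problem is convex, a convex combination of feasible solutions remains feasible, which is what makes this averaging step meaningful; how the mixing weights are chosen (e.g., uniformly or by a small optimization) is a design choice of the method in the paper.
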
Authors
  • Tschiatschek, Sebastian
  • Pernkopf, Franz
Shortfacts
  • Category: Paper in Conference Proceedings or in Workshop Proceedings (Paper)
  • Event Title: International Conference on Pattern Recognition Applications and Methods (ICPRAM)
  • Divisions: Data Mining and Machine Learning
  • Event Location: Vilamoura, Portugal
  • Event Type: Conference
  • Event Dates: 06.-08.02.2012
  • Series Name: Proceedings of the 1st International Conference on Pattern Recognition Applications and Methods - Volume 2: ICPRAM
  • ISSN/ISBN: 2184-4313 / 978-989-8425-98-0
  • Page Range: pp. 69-77
  • Date: 6 February 2012