Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/108711
Type: Conference paper
Title: Decision-theoretic sparsification for Gaussian process preference learning
Author: Abbasnejad, M.
Bonilla, E.
Sanner, S.
Citation: Lecture Notes in Artificial Intelligence, 2013 / Blockeel, H., Kersting, K., Nijssen, S., Železný, F. (ed./s), vol. 8189 LNAI, iss. Part 2, pp. 515-530
Publisher: Springer
Issue Date: 2013
Series/Report no.: Lecture Notes in Computer Science; 8189
ISBN: 9783642409905
ISSN: 0302-9743
1611-3349
Conference Name: 2013 Joint European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2013) (23 Sep 2013 - 27 Sep 2013 : Prague, Czech Republic)
Editor: Blockeel, H.
Kersting, K.
Nijssen, S.
Železný, F.
Statement of Responsibility: M. Ehsan Abbasnejad, Edwin V. Bonilla, and Scott Sanner
Abstract: We propose a decision-theoretic sparsification method for Gaussian process preference learning. This method overcomes the loss-insensitive nature of popular sparsification approaches such as the Informative Vector Machine (IVM). Instead of selecting a subset of users and items as inducing points based on uncertainty-reduction principles, our sparsification approach is underpinned by decision theory and directly incorporates the loss function inherent to the underlying preference learning problem. We show that by selecting different specifications of the loss function, the IVM’s differential entropy criterion, a value of information criterion, and an upper confidence bound (UCB) criterion used in the bandit setting can all be recovered from our decision-theoretic framework. We refer to our method as the Valuable Vector Machine (VVM) as it selects the most useful items during sparsification to minimize the corresponding loss. We evaluate our approach on one synthetic and two real-world preference datasets, including one generated via Amazon Mechanical Turk and another collected from Facebook. Experiments show that variants of the VVM significantly outperform the IVM on all datasets under similar computational constraints.
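The abstract describes a greedy, criterion-driven selection of inducing points in which entropy-based (IVM-style) and loss-sensitive (VVM-style, e.g. UCB) criteria are special cases of one framework. The sketch below is an illustration of that idea only, not the authors' algorithm: the function greedy_select, the simple scalar criteria, and the omission of the GP posterior update after each selection are all assumptions made for brevity.

import numpy as np

def greedy_select(mu, var, k, criterion):
    """Pick k inducing points by repeatedly maximizing `criterion`.

    mu, var   : posterior mean and variance at each candidate point
    criterion : maps (mu, var) arrays to per-candidate scores
                (higher = more useful to keep during sparsification)
    """
    chosen, remaining = [], list(range(len(mu)))
    for _ in range(k):
        scores = criterion(mu[remaining], var[remaining])
        best = remaining[int(np.argmax(scores))]
        chosen.append(best)
        remaining.remove(best)
        # A real GP method would also update mu/var to condition on the
        # newly selected point; that posterior update is omitted here.
    return chosen

# IVM-style criterion: differential entropy of a Gaussian,
# 0.5 * log(2 * pi * e * var) -- purely uncertainty-driven.
entropy = lambda mu, var: 0.5 * np.log(2 * np.pi * np.e * var)

# Hypothetical loss-sensitive criterion in the spirit of the VVM:
# a UCB-like score trading off estimated utility (mu) and uncertainty.
ucb = lambda mu, var, beta=2.0: mu + beta * np.sqrt(var)

mu = np.random.randn(100)
var = np.random.rand(100) + 1e-3
print(greedy_select(mu, var, 5, entropy))  # uncertainty-only selection
print(greedy_select(mu, var, 5, ucb))      # loss-aware selection

Swapping the criterion while keeping the same greedy loop mirrors the paper's claim that the IVM's differential entropy criterion, a value-of-information criterion, and a UCB criterion can all be recovered from one decision-theoretic framework.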
Rights: © Springer-Verlag Berlin Heidelberg 2013
DOI: 10.1007/978-3-642-40991-2_33
Published version: http://dx.doi.org/10.1007/978-3-642-40991-2_33
Appears in Collections:Aurora harvest 8
Computer Science publications

Files in This Item:
There are no files associated with this item.
