2008 IEEE Conference on Computer Vision and Pattern Recognition (2008)
Anchorage, AK, USA
June 23, 2008 to June 28, 2008
Ariadna Quattoni, MIT CSAIL, USA
Michael Collins, MIT CSAIL, USA
Trevor Darrell, UC Berkeley EECS & ICSI, USA
To learn a new visual category from few examples, prior knowledge from unlabeled data as well as from previously learned related categories may be useful. We develop a new method for transfer learning that exploits available unlabeled data and an arbitrary kernel function: we form a representation based on kernel distances to a large set of unlabeled data points. To transfer knowledge from previous related problems, we observe that a category might be learnable using only a small subset of reference prototypes. Related problems may share a significant number of relevant prototypes; we find such a concise representation by performing a joint loss minimization over the training sets of related problems with a shared regularization penalty that minimizes the total number of prototypes involved in the approximation. This optimization problem can be formulated as a linear program and solved efficiently. We conduct experiments on a news-topic prediction task, where the goal is to predict whether an image belongs to a particular news topic. Our results show that when only a few examples are available for training a target topic, leveraging knowledge learnt from other topics can significantly improve performance.
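The kernel-distance prototype representation described in the abstract can be sketched as follows. This is a hypothetical illustration only, not the authors' code: it assumes an RBF kernel (the method admits an arbitrary kernel) and NumPy, and all function names are ours.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # One possible kernel choice; the paper's method works with an
    # arbitrary kernel function.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def prototype_representation(x, prototypes, gamma=1.0):
    # Map an image feature vector x to its kernel values against a set
    # of unlabeled reference prototypes. A (sparse) linear classifier is
    # then learned over this representation; joint training across
    # related topics encourages the tasks to share a small common
    # subset of prototypes.
    return np.array([rbf_kernel(x, p, gamma) for p in prototypes])

# Toy usage: 5 unlabeled prototypes in a 3-D feature space.
rng = np.random.default_rng(0)
prototypes = rng.standard_normal((5, 3))
x = rng.standard_normal(3)
phi = prototype_representation(x, prototypes)
print(phi.shape)  # (5,)
```

A classifier trained on phi that uses only a few nonzero weights corresponds to selecting a small subset of prototypes, which is what the shared regularization penalty promotes across related problems.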
A. Quattoni, M. Collins and T. Darrell, "Transfer learning for image classification with sparse prototype representations," 2008 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Anchorage, AK, USA, 2008, pp. 1-8.