2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Honolulu, Hawaii, USA
July 21, 2017 to July 26, 2017
ISSN: 1063-6919
ISBN: 978-1-5386-0457-1
pp: 2770-2779
ABSTRACT
We propose StyleBank, which is composed of multiple convolution filter banks, each explicitly representing one style, for neural image style transfer. To transfer an image to a specific style, the corresponding filter bank is operated on top of the intermediate feature embedding produced by a single auto-encoder. The StyleBank and the auto-encoder are jointly learnt, where the learning is conducted in such a way that the auto-encoder does not encode any style information, thanks to the flexibility introduced by the explicit filter bank representation. It also enables us to conduct incremental learning to add a new image style by learning a new filter bank while holding the auto-encoder fixed. The explicit style representation along with the flexible network design enables us to fuse styles at not only the image level, but also the region level. Our method is the first style transfer network that links back to traditional texton mapping methods, and hence provides new understanding of neural style transfer. Our method is easy to train, runs in real-time, and produces results that are qualitatively better than or at least comparable to existing methods.
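The core idea in the abstract, a shared style-free auto-encoder plus one independent filter bank per style, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the encoder is stood in for by a fixed linear channel projection, and each "filter bank" is reduced to a 1x1 channel-mixing convolution (the paper uses full spatial convolution banks); all names and shapes are assumptions.

```python
import numpy as np

def encode(image):
    # Stand-in for the shared encoder: a fixed linear projection of the
    # input channels into an 8-channel feature embedding (assumption).
    rng = np.random.default_rng(0)
    W = rng.standard_normal((8, image.shape[0]))
    return np.einsum('oc,chw->ohw', W, image)

def apply_stylebank(features, filter_bank):
    # One filter bank per style, applied to the SAME shared features.
    # Here simplified to a 1x1 convolution (channel mixing) for brevity.
    return np.einsum('oc,chw->ohw', filter_bank, features)

rng = np.random.default_rng(1)
# Two styles = two independent filter banks; adding a style later would
# mean learning one more bank while the encoder stays fixed.
style_banks = {name: rng.standard_normal((8, 8))
               for name in ("style_a", "style_b")}

image = rng.standard_normal((3, 16, 16))   # C x H x W input
features = encode(image)                   # shared, style-agnostic embedding
stylized = {name: apply_stylebank(features, bank)
            for name, bank in style_banks.items()}

print(features.shape)             # (8, 16, 16)
print(stylized["style_a"].shape)  # (8, 16, 16)
```

Because every style touches only its own bank, the region-level fusion mentioned in the abstract amounts to applying different banks to different spatial regions of the same encoder output before decoding.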
INDEX TERMS
channel bank filters, feature extraction, image filtering, image representation, image texture, learning (artificial intelligence), neural nets
CITATION

D. Chen, L. Yuan, J. Liao, N. Yu and G. Hua, "StyleBank: An Explicit Representation for Neural Image Style Transfer," 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, Hawaii, USA, 2017, pp. 2770-2779.
doi:10.1109/CVPR.2017.296