Title :
Coupled space learning of image style transformation
Author :
Lin, Dahua ; Tang, Xiaoou
Author_Institution :
Dept. of Inf. Eng., The Chinese Univ. of Hong Kong
Abstract :
In this paper, we present a new learning framework for image style transforms. Since images in different style representations constitute different vector spaces, we propose a framework called coupled space learning that learns the relations between these spaces and uses them to infer images of one style from another. Observing that, for each style, only the components correlated with the target style's space are useful for inference, we first develop correlative component analysis to pursue the embedded hidden subspaces that best preserve the inter-space correlation information. We then develop the coupled bidirectional transform algorithm to estimate the transforms between the two embedded spaces, explicitly taking into account the coupling between the forward and backward transforms. To enhance the capability of modeling complex data, we further develop the coupled Gaussian mixture model, which generalizes the framework to a mixture-model architecture. The effectiveness of the framework is demonstrated in applications including face super-resolution and bidirectional portrait style transforms.
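A minimal, illustrative sketch of the pipeline described in the abstract, written in Python with NumPy. It uses a CCA-style whitened SVD as a stand-in for the paper's correlative component analysis and fits the forward and backward maps independently by least squares, whereas the paper estimates them jointly; all function names, the regularizer, and the synthetic data are assumptions for illustration, and the coupled Gaussian mixture extension is omitted.

import numpy as np

def correlative_components(X, Y, d, reg=1e-6):
    # Illustrative stand-in for correlative component analysis: find
    # d-dimensional projections of the two style spaces that preserve
    # their inter-space correlation, via a whitened cross-covariance SVD.
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    n = len(X)
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx)).T  # whitener for style A
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy)).T  # whitener for style B
    U, _, Vt = np.linalg.svd(Wx.T @ Cxy @ Wy)
    return Wx @ U[:, :d], Wy @ Vt.T[:, :d]         # projections Px, Py

def bidirectional_transforms(Zx, Zy):
    # Least-squares forward (A->B) and backward (B->A) maps between the
    # embedded spaces; fitted separately here for simplicity, unlike the
    # paper's coupled estimation.
    F = np.linalg.lstsq(Zx, Zy, rcond=None)[0]
    B = np.linalg.lstsq(Zy, Zx, rcond=None)[0]
    return F, B

# Toy usage on synthetic coupled data (e.g. low-res / high-res face pairs).
rng = np.random.default_rng(0)
latent = rng.standard_normal((200, 5))
X = latent @ rng.standard_normal((5, 30)) + 0.1 * rng.standard_normal((200, 30))
Y = latent @ rng.standard_normal((5, 50)) + 0.1 * rng.standard_normal((200, 50))

Px, Py = correlative_components(X, Y, d=5)
Zx, Zy = (X - X.mean(0)) @ Px, (Y - Y.mean(0)) @ Py
F, B = bidirectional_transforms(Zx, Zy)

# Infer a style-B image from a style-A image: embed, map forward, then
# back-project from the embedded subspace to the image space. The reverse
# direction uses the backward map B in the same way.
x_new = X[0]
z_pred = ((x_new - X.mean(0)) @ Px) @ F
y_hat = Y.mean(0) + np.linalg.pinv(Py.T) @ z_pred
print(y_hat.shape)  # (50,)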
Keywords :
Gaussian processes; image processing; learning (artificial intelligence); transforms; backward transform; correlative component analysis; coupled Gaussian mixture model; coupled bidirectional transform; coupled space learning; forward transform; image style transformation; mixture-model architecture; vector spaces; Application software; Asia; Computer architecture; Computer vision; Face detection; Image reconstruction; Inference algorithms; Information analysis; Principal component analysis; Statistical learning;
Conference_Titel :
Tenth IEEE International Conference on Computer Vision (ICCV 2005)
Conference_Location :
Beijing
Print_ISBN :
0-7695-2334-X
DOI :
10.1109/ICCV.2005.65