PairedCycleGAN: Asymmetric Style Transfer for Applying and Removing Makeup

Author(s): Chang, Huiwen; Lu, Jingwan; Yu, Fisher; Finkelstein, Adam

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1v85f
Abstract: This paper introduces an automatic method for editing a portrait photo so that the subject appears to be wearing makeup in the style of another person in a reference photo. Our unsupervised learning approach relies on a new framework of cycle-consistent generative adversarial networks. Unlike the typical image domain transfer problem, our style transfer problem involves two asymmetric functions: a forward function encodes example-based style transfer, whereas a backward function removes the style. We construct two coupled networks to implement these functions (one that transfers makeup style and a second that removes makeup) such that the output of their successive application to an input photo matches the input. The learned style network can then quickly apply an arbitrary makeup style to an arbitrary photo. We demonstrate the effectiveness of our approach on a broad range of portraits and styles.
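The asymmetric cycle constraint described in the abstract can be sketched in code. The snippet below is a minimal toy illustration, not the paper's method: the two learned CNNs are replaced by hypothetical hand-coded functions (`apply_makeup`, `remove_makeup`), and "makeup" is modeled as a simple per-channel color tint so the cycle can be verified exactly. What it shows is the asymmetry of the two functions (the forward one takes both a source and a reference photo; the backward one takes only the styled photo) and the cycle-consistency objective: applying then removing the style should reproduce the input.

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_makeup(source, reference):
    # Forward function G (asymmetric): takes BOTH a source photo and a
    # reference photo, and transfers the reference's "style" onto the
    # source. Here the style is just the reference's mean color tint.
    tint = reference.mean(axis=(0, 1))
    return source + tint

def remove_makeup(styled):
    # Backward function F: takes ONLY the styled photo and strips the
    # style. In this toy, makeup-free faces are assumed to have zero
    # per-channel mean, so the tint can be read off the styled image.
    return styled - styled.mean(axis=(0, 1))

def cycle_consistency_loss(source, reference):
    # The constraint from the abstract: successive application of the
    # two functions to an input photo should match the input.
    reconstructed = remove_makeup(apply_makeup(source, reference))
    return np.abs(reconstructed - source).mean()

# Toy 8x8 RGB "photos": a zero-mean source face and a tinted reference.
source = rng.normal(size=(8, 8, 3))
source -= source.mean(axis=(0, 1))
reference = rng.normal(size=(8, 8, 3)) + 0.5

print(cycle_consistency_loss(source, reference))  # ~0: the cycle closes
```

In the paper both functions are neural networks trained adversarially, and the cycle loss above is one term of the full training objective; this sketch only demonstrates why the two functions must have different signatures, which is what makes the setup "asymmetric".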
Publication Date: 2018
Citation: Chang, Huiwen, Jingwan Lu, Fisher Yu, and Adam Finkelstein. "PairedCycleGAN: Asymmetric Style Transfer for Applying and Removing Makeup." In IEEE/CVF Conference on Computer Vision and Pattern Recognition (2018): pp. 40-48. doi:10.1109/CVPR.2018.00012
DOI: 10.1109/CVPR.2018.00012
EISSN: 2575-7075
Pages: 40 - 48
Type of Material: Conference Article
Journal/Proceeding Title: IEEE/CVF Conference on Computer Vision and Pattern Recognition
Version: Author's manuscript
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.