Visual Attribute Transfer through Deep Image Analogy: https://arxiv.org/abs/1705.01088
We propose a new technique for visual attribute transfer across images that may have very different appearance but have perceptually similar semantic structure. By visual attribute transfer, we mean transfer of visual information (such as color, tone, texture, and style) from one image to another. For example, one image could be that of a painting or a sketch while the other is a photo of a real scene, and both depict the same type of scene.
Our technique finds semantically meaningful dense correspondences between two input images. To accomplish this, it adapts the notion of "image analogy" with features extracted from a Deep Convolutional Neural Network for matching; we call our technique Deep Image Analogy. A coarse-to-fine strategy is used to compute the nearest-neighbor field for generating the results. We validate the effectiveness of our proposed method in a variety of cases, including style/texture transfer, color/style swap, sketch/painting to photo, and time lapse.
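The core matching step pairs each spatial location in one deep feature map with its most similar location in the other. The paper uses a PatchMatch-style search over patches; the sketch below substitutes a brute-force per-pixel nearest-neighbor search on normalized features, which illustrates the idea on small maps. The function name and the assumption that features arrive as (H, W, C) arrays (e.g. from a VGG layer) are mine, not from the paper.

```python
import numpy as np

def nearest_neighbor_field(feat_a, feat_b):
    """Brute-force dense nearest-neighbor field between two feature maps.

    feat_a, feat_b: (H, W, C) float arrays of deep features (hypothetical
    inputs; in the paper these would come from CNN layers, matched
    coarse-to-fine). Returns an (Ha, Wa, 2) array mapping each location
    in feat_a to the (row, col) of its best match in feat_b.
    """
    Ha, Wa, C = feat_a.shape
    Hb, Wb, _ = feat_b.shape
    a = feat_a.reshape(-1, C)
    b = feat_b.reshape(-1, C)
    # Normalize so matching uses cosine similarity, which is robust
    # to differences in feature magnitude between the two images.
    a = a / (np.linalg.norm(a, axis=1, keepdims=True) + 1e-8)
    b = b / (np.linalg.norm(b, axis=1, keepdims=True) + 1e-8)
    sim = a @ b.T                    # (Ha*Wa, Hb*Wb) similarity matrix
    best = sim.argmax(axis=1)        # flat index of best match per source pixel
    nnf = np.stack([best // Wb, best % Wb], axis=1).reshape(Ha, Wa, 2)
    return nnf
```

A brute-force search is O((HW)^2) and only feasible at coarse layers; the paper's coarse-to-fine strategy upsamples the field from a coarse layer to initialize the search at the next finer one.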
Deep Image Analogy:
We have demonstrated a new technique for transferring visual attributes across semantically-related images. We adapted the notion of image analogy to a deep feature space for finding semantically meaningful dense correspondences. Our method outperforms previous methods on image pairs that exhibit significant variation in appearance, including lighting, color, texture, and style. We have shown that our approach is widely applicable to visual attribute transfer in real-world images, as well as to additional transfer challenges such as content-specific stylization, style to photo, and style to style. We believe this method may also prove useful for a variety of computer graphics and vision applications that rely on semantic-level correspondences.
Supplemental Material for "Visual Attribute Transfer through Deep Image Analogy": https://liaojing.github.io/html/data/analogy_supplemental.pdf