Deep Image Style Transfer from Freeform Text

12/13/2022
by Tejas Santanam, et al.

This paper presents a novel method for deep neural style transfer that generates style images from freeform user text input. Given a style text and description, a language model retrieves a closely matching style image, which is then passed to a style transfer model along with an input content image to produce the final output. The combined language and style transfer models form a seamless pipeline whose outputs achieve losses comparable to, and quality better than, baseline style transfer methods. A proof-of-concept tool integrates the two models and demonstrates the effectiveness of deep image style transfer from freeform text.
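The paper's code is not reproduced here, but the retrieval half of such a pipeline can be sketched with off-the-shelf components. The following is a minimal illustration, assuming OpenAI's CLIP as the language/vision model and a local directory of candidate style images; the function name, directory layout, and model choice ("ViT-B/32") are assumptions for illustration, not the authors' confirmed implementation.

```python
# Sketch: retrieve the style image whose CLIP embedding best matches a
# freeform style text, then hand it to an ordinary style transfer model.
from pathlib import Path

import torch
import clip  # pip install git+https://github.com/openai/CLIP.git
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)


def retrieve_style_image(style_text: str, candidate_dir: str) -> Path:
    """Return the path of the candidate image most similar to the text."""
    text_tokens = clip.tokenize([style_text]).to(device)
    paths = sorted(Path(candidate_dir).glob("*.jpg"))
    images = torch.stack(
        [preprocess(Image.open(p).convert("RGB")) for p in paths]
    ).to(device)

    with torch.no_grad():
        text_emb = model.encode_text(text_tokens)
        image_embs = model.encode_image(images)
        # Normalize so the dot product is cosine similarity.
        text_emb /= text_emb.norm(dim=-1, keepdim=True)
        image_embs /= image_embs.norm(dim=-1, keepdim=True)
        scores = (image_embs @ text_emb.T).squeeze(1)

    return paths[scores.argmax().item()]


# Hypothetical usage: the retrieved image then serves as the style input
# to a standard image-to-image style transfer model (e.g., Gatys et al.)
# together with the user's content image.
style_path = retrieve_style_image("a swirling Van Gogh night sky", "styles/")
```

In a full pipeline, the retrieved image and the content image would both be fed to the style transfer stage, which is unchanged from standard image-to-image methods.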


Related Research

11/14/2022 · Replacing Language Model for Style Transfer
We introduce replacing language model (RLM), a sequence-to-sequence lang...

10/07/2022 · FastCLIPStyler: Towards fast text-based image style transfer using style representation
Artistic style transfer is usually performed between two images, a style...

03/24/2022 · Neural Neighbor Style Transfer
We propose Neural Neighbor Style Transfer (NNST), a pipeline that offers...

02/16/2020 · Learning to Generate Multiple Style Transfer Outputs for an Input Sentence
Text style transfer refers to the task of rephrasing a given text in a d...

12/01/2021 · CLIPstyler: Image Style Transfer with a Single Text Condition
Existing neural style transfer methods require reference style images to...

12/10/2018 · Self-Contained Stylization via Steganography for Reverse and Serial Style Transfer
Style transfer has been widely applied to give real-world images a new a...

09/10/2019 · Raiders of the Lost Art
Neural style transfer, first proposed by Gatys et al. (2015), can be use...
