Not with my name! Inferring artists' names of input strings employed by Diffusion Models

07/25/2023
by   Roberto Leotta, et al.

Diffusion Models (DMs) are highly effective at generating realistic, high-quality images. However, these models lack creativity and merely compose outputs based on their training data, guided by a textual input provided at generation time. Is it acceptable to generate images reminiscent of an artist by using their name as input? This implies that, if the DM is able to replicate an artist's work, then it was trained on some or all of that artist's artworks, thus violating copyright. In this paper, we present a preliminary study that infers the probability that an artist's name was used in the input string of a generated image. To this aim, we focused only on images generated by the well-known DALL-E 2 and collected images (both original and generated) of five renowned artists. A dedicated Siamese Neural Network was then employed to obtain a first estimate of this probability. Experimental results demonstrate that our approach is a promising starting point and can be employed as a prior for predicting the complete input string of an investigated image. Dataset and code are available at: https://github.com/ictlab-unict/not-with-my-name .
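
As a rough illustration of the approach described in the abstract, the sketch below shows how a Siamese network could compare a generated image against an artist's original artworks and aggregate the pairwise similarities into a probability-like score. This is an assumption-laden sketch, not the authors' actual architecture: the ResNet-18 backbone, embedding size, and the similarity-to-probability mapping are all hypothetical choices; the real model and training procedure are in the linked repository.

```python
# Minimal sketch (assumed architecture, NOT the authors' implementation) of a
# Siamese network that scores how likely a generated image was prompted with a
# given artist's name, by comparing it to that artist's original artworks.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class SiameseNet(nn.Module):
    def __init__(self, embedding_dim: int = 128):
        super().__init__()
        # Shared convolutional backbone (assumption: a ResNet-18 trunk).
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()  # keep the 512-d pooled features
        self.encoder = backbone
        self.head = nn.Linear(512, embedding_dim)

    def embed(self, x: torch.Tensor) -> torch.Tensor:
        # L2-normalised embedding so cosine similarity is a plain dot product.
        return F.normalize(self.head(self.encoder(x)), dim=-1)

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        # Cosine similarity in [-1, 1] between the two branches.
        return (self.embed(a) * self.embed(b)).sum(dim=-1)


def artist_name_probability(model: SiameseNet,
                            generated: torch.Tensor,
                            originals: torch.Tensor) -> float:
    """Aggregate similarities between one generated image (1 x 3 x H x W) and a
    batch of an artist's originals (N x 3 x H x W) into a rough score in [0, 1]."""
    with torch.no_grad():
        sims = model(generated.expand(originals.size(0), -1, -1, -1), originals)
    # Map the mean cosine similarity from [-1, 1] to [0, 1] (an assumed heuristic,
    # standing in for whatever calibration the paper actually uses).
    return (sims.mean().item() + 1.0) / 2.0
```

In practice such a network would be trained with a contrastive or triplet objective on pairs of original and generated artworks, and the resulting score would serve only as the prior mentioned in the abstract, not as a definitive attribution of the input string.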

