Safeguarding Your Art from AI: What You Need to Know

Steps you can take to prevent your art from being used by AI

The past 12 months have seen some crazy developments in the world of artificial intelligence.

Large language models like ChatGPT have completely upended the value of the written word, while apps like Midjourney are spitting out jaw-dropping images in seconds. But as advanced as today’s AI might seem, it had a pretty humble teacher…

Us.

It’s only through analyzing billions of human-generated images and pieces of text that these AIs are able to generate new content. While there might not be a backlash against text being scanned from something like the Library of Congress or Project Gutenberg, images are a very different story.

In this post we’ll dive into why so many artists are taking measures to protect their art, and how you can safeguard your work from the hungry eyes of AI.

Why artists are worried about AI

When developers set out to create an image generator like DALL-E, Midjourney or Stable Diffusion, they start with a MASSIVE dataset containing billions of images paired with text that describes them. It’s only by learning from these datasets that the AI is able to mash together new content.

The problem is where exactly these images come from.

All of the above are AI-generated, yet whose mangled signatures are those?

For example, ask Midjourney to create a piece in the style of, say, Mike Mignola of Hellboy fame, and it will usually produce a passable imitation. That’s bad news for artists who make a living from their distinctive style. Not only does it devalue their work, it also makes it possible for con artists to pass that art off as their own.

It also means the AI has already scanned and assimilated a considerable amount of that artist’s work. Something most artists never agreed to.

How to check if your art has already been used by AI

Removing your art from the inner machinations of an image synthesizer is no easy task. Once the data has been fed into the AI, it’s very difficult to pluck one individual image out of the billions included.

With that said, there is a free tool you can use to check if your art has been included in these large datasets. It’s called Have I Been Trained and it’s essentially Google for image datasets.

You simply upload an image or enter a search term, and the site will search LAION, the largest publicly available image-text dataset, for matches. If your art appears in the results, you can flag it for removal.

Searching HaveIBeenTrained.com reveals Mike Mignola’s art has DEFINITELY been incorporated into AI.
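Have I Been Trained doesn’t publish an official API, but it’s built on top of the open-source clip-retrieval project, which can query a hosted LAION index directly. Here’s a minimal sketch in Python, assuming the community-run knn.laion.ai endpoint and index name are still live (both are assumptions and may change or go offline):

```python
# pip install clip-retrieval
from clip_retrieval.clip_client import ClipClient

# Hosted LAION-5B index; the endpoint and index name are community
# defaults and may be offline or renamed by the time you run this.
client = ClipClient(
    url="https://knn.laion.ai/knn-service",
    indice_name="laion5B-L-14",
    num_images=10,
)

# Search by text, or pass image="my_art.png" to search by image instead.
for match in client.query(text="hellboy comic art"):
    print(f'{match["similarity"]:.3f}  {match["url"]}')
```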

Opt out of AI where you can

Historically, art portfolio platforms like ArtStation, DeviantArt and Behance have been frequent sources for AI training. Due to significant public backlash, several sites have changed their policies about AI crawlers.

ArtStation recently implemented a “NoAI” option for posts on the platform. When you tag your projects with “NoAI”, ArtStation automatically adds a meta tag to the page that tells AI crawlers not to use that content.

Much to the ire of many users, the option is NOT enabled by default.

DeviantArt has introduced a similar opt-out option that IS enabled by default. According to a recent update:

DeviantArt did not consent to third-party technology usage of images on our site, which were used to train AI models. In an effort to combat future unauthorized usage, we have enacted “noai” — an industry-first directive alerting AI models if deviants request for their work not to be used.

DeviantArt’s stance on AI

While the “NoAI” tag is a step in the right direction, it’s ultimately up to AI developers to respect users’ choices. It doesn’t technically block AI crawlers; it simply makes their presence a violation of these websites’ terms of service.
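These “noai” directives live in the page’s HTML, so you can check whether a gallery page actually carries one. Here’s a minimal sketch in Python, assuming the site follows the convention DeviantArt describes (a “noai” value inside a standard robots meta tag); the example URL is a placeholder:

```python
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def has_noai_tag(url: str) -> bool:
    """Return True if the page declares a 'noai' robots meta directive."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all("meta", attrs={"name": "robots"}):
        content = (tag.get("content") or "").lower()
        if "noai" in content:
            return True
    return False

print(has_noai_tag("https://www.deviantart.com/"))  # placeholder URL
```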

Watermarking and software solutions

If your art is viewable online, it’s unfortunately always possible for a renegade AI to view and assimilate it. But that doesn’t mean you can’t make it difficult for them…

Photographers have long turned to watermarking to protect the authenticity of their work. Embedding a name or signature across the image not only prevents other people from claiming it as their own, it also makes it hard for AI to properly “see” the artwork.

The watermark can become jumbled into the source content and ruin any future creations the AI might try to produce with your art.

There are several free tools for watermarking your art, like Watermarkly or Fotor.
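If you’d rather not upload your work to a third-party tool, a basic tiled watermark is easy to script yourself. Here’s a minimal sketch using Python’s Pillow library; the file names, text, and spacing are placeholders:

```python
# pip install Pillow
from PIL import Image, ImageDraw, ImageFont

def watermark(path_in: str, path_out: str, text: str) -> None:
    """Tile a semi-transparent text watermark across an image."""
    base = Image.open(path_in).convert("RGBA")
    overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    font = ImageFont.load_default()  # swap in a .ttf file for a larger mark

    step = 200  # spacing between repeats, in pixels
    for x in range(0, base.width, step):
        for y in range(0, base.height, step):
            draw.text((x, y), text, fill=(255, 255, 255, 96), font=font)

    Image.alpha_composite(base, overlay).convert("RGB").save(path_out)

watermark("artwork.png", "artwork_marked.png", "© Your Name")
```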

New solutions like Sanative AI are popping up that take watermarking a step further. Their free-to-use tool helps protect the work of creators who haven’t consented to having their images fed to AI generators.

They accomplish this by embedding images with a unique, invisible watermark designed to prevent those images from being used to train generative AI.

Sanative’s invisible AI watermark in action
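Sanative hasn’t published how its watermark actually works, so don’t read the following as their method. As a toy illustration of the general idea, though (data hidden in pixels the eye can’t see), here’s a classic least-significant-bit embed in Python. Real anti-AI watermarks are far more sophisticated; this one won’t even survive JPEG compression:

```python
# pip install Pillow numpy
from PIL import Image
import numpy as np

def embed_lsb(path_in: str, path_out: str, message: str) -> None:
    """Hide a short string in the least-significant bits of the red channel."""
    img = np.array(Image.open(path_in).convert("RGB"))
    bits = np.unpackbits(np.frombuffer(message.encode("utf-8"), dtype=np.uint8))
    red = img[..., 0].flatten()
    if bits.size > red.size:
        raise ValueError("message too long for this image")
    red[:bits.size] = (red[:bits.size] & 0xFE) | bits  # overwrite the lowest bit
    img[..., 0] = red.reshape(img.shape[:2])
    Image.fromarray(img).save(path_out)  # PNG only: lossy formats destroy the bits

embed_lsb("artwork.png", "artwork_tagged.png", "(c) Your Name")
```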

For extreme cases, treat your art like a password

While AI-generated content may one day be required to be clearly labeled, for now we’re pretty much in the Wild West.

Safeguarding your art comes down to how much exposure you’re okay with. If the answer is, “the robots get my art over my cold, dead body,” there are still options for you.

Just like you wouldn’t post your email login on Twitter, you can apply the same caution to your original art.

  • Remove your art from social media or personal websites
  • Only share it directly with people you trust
  • Create password-protected .zip or .7z archives or PDFs when you need to show your portfolio (see the sketch after this list)
  • Print copies of your art and mail them as needed
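For the archive route, adding a password is worth the extra step, since a plain .zip offers no real barrier once it leaves your hands. Here’s a minimal sketch using Python’s py7zr library; the file names and password are placeholders:

```python
# pip install py7zr
import py7zr

# Bundle a portfolio folder into an encrypted .7z archive.
with py7zr.SevenZipFile("portfolio.7z", "w", password="replace-me") as archive:
    archive.set_encrypted_header(True)   # encrypt the file names as well
    archive.writeall("portfolio/", arcname="portfolio")
```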

Final thoughts

AI-generated art isn’t going away any time soon.

Staying one step ahead of AI (and its developers) means artists will need to decide how far they’re willing to go to protect their art.

In the end, it’s important to remember that art is, and always will be, a human-driven endeavor. No matter how advanced AI art becomes, it will never be capable of appreciating its own creation.


