Artists look to protect their work from AI with an anti-theft script

Artists vs AI: the power struggle continues as artists explore more ways to regain control of their work

Photo credit: Joshua Reddekopp

As the battle between artists and generative AI continues, an artist has created a watermark generator to protect their work.

You know the saying, fight fire with fire? That sums up this idea pretty well. Artists have long had to deal with people trying to steal their work and claim it as their own. However, with the evolution of AI, they now also have to contend with AI art.

These generative AIs can create a piece of work from a simple sentence or a handful of keywords. Artists have reported that some of their work has been taken from the internet and added to the libraries of images available to AI programs.

Artists battle against AI

Some platforms have tried to put the power back in artists' hands: ArtStation now lets artists opt out of having their work used for AI research. However, other artists are struggling with their legitimate art being wrongly dubbed AI-generated. Finding out who, or what, originally created a piece of work just got more complicated.

To tackle this issue, an ArtStation user going by Eballai posted news of a Python script that adds an invisible watermark to PNG images. The watermark is written into the image data itself rather than being a visible mark detectable by the naked eye. When AI bots scan the internet for images to collect for their databases, they will be able to detect this watermark.

The purpose of the watermark is to make the AI skip over the artist's work, so that it is never collected into a model's training data and reused for the model's own purposes in the future.
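To make the idea concrete, here is a minimal, illustrative sketch of one common way to hide a marker in pixel data: least-significant-bit (LSB) steganography. This is not Eballai's actual script (which reportedly reused Stable Diffusion's watermarking code); the `NO-AI` marker string and the function names are hypothetical, chosen only to show how a machine-readable flag can sit invisibly inside an image.

```python
# Illustrative LSB-steganography sketch (not the actual ArtStation script).
# A short marker string is hidden in the lowest bit of successive pixel
# values; a scraper that knows the convention could detect it and skip
# the image, while the change is invisible to the naked eye.

WATERMARK = "NO-AI"  # hypothetical marker a crawler could check for

def embed(pixels, text=WATERMARK):
    """Write each bit of `text` into the least-significant bit of pixels."""
    bits = [(byte >> i) & 1 for byte in text.encode() for i in range(7, -1, -1)]
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # overwrite only the lowest bit
    return out

def extract(pixels, length=len(WATERMARK)):
    """Read the marker back out of the lowest bits."""
    bits = [p & 1 for p in pixels[:length * 8]]
    data = bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[n:n + 8]))
        for n in range(0, len(bits), 8)
    )
    return data.decode()

# Stand-in for one channel of image data; each value changes by at most 1.
pixels = [120, 33, 240, 17, 99, 200] * 10
stamped = embed(pixels)
print(extract(stamped))  # -> NO-AI
```

Because only the lowest bit of each value is touched, no pixel shifts by more than one brightness level, which is why the mark stays invisible while remaining trivially machine-readable.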

Not all it’s cracked up to be?

However, coders who saw the post went on to analyse the program and its code. Their findings suggested that the 12 lines of Python are a near-exact copy of code from Stable Diffusion 2. Some relished pointing out the irony of a post about artists' rights failing to give proper credit.

Artists continue to look for other ways to protect their work, such as traditional visible watermarks. Some have even decided to take a hit to their visibility by deliberately mis-tagging their artwork, making it more difficult for AI algorithms to find and use their art.

While the advancements we are seeing with AI generative tools are impressive, there is still a way to go before artists feel back in control of their work and therefore more accepting of the presence of AI.

Written By

Paige Cook is a writer with a multi-media background. She has experience covering video games and technology and also has freelance experience in video editing, graphic design, and photography. Paige is a massive fan of the movie industry and loves a good TV show, if she is not watching something interesting then she's probably playing video games or buried in a good book. Her latest addiction is virtual photography and currently spends far too much time taking pretty pictures in games rather than actually finishing them.
