Some game developers are far too shameless about generative AI use

I’m having to be steadily more careful about what I choose to cover here on GamingOnLinux, as more games release using generative AI.
To be clear from the start: my current stance is generally to ignore games that use generative AI. My whole point is to support creative people, not machines. That’s quickly getting more difficult, because some developers won’t disclose it at all (even though they’re supposed to), it’s steadily being used by more studios for all sorts of things big and small, and the AI notice on Steam is buried at the bottom of pages. Sometimes I forget to check before getting into a game, and end up wasting my time when I spot the notice later.
Which brings me to the latest example, where the developers are completely brazen about how they’re using generative AI. The game is called GUG, and the thing is, the idea behind it sounds really cool, but their full statement on their generative AI use really rubbed me the wrong way.
What is it?
GUG is a sandbox simulation roguelike where you generate monsters — “gugs” — by typing any word or phrase. Your choice of words determines how they look, behave and battle. Type “banana flamethrower” and you might get a spiky yellow pyromaniac. Type “emotional support gremlin” and get… well, who really knows?
Sounds cool right? Then you see the Steam page AI notice:
The developers describe how their game uses AI Generated Content like this:
This game relies on AI to underpin its procedural image and functionality generation systems (the systems that allow for personalised creatures and their effects).
I looked a bit deeper, and found their full FAQ which they’ve linked in a Steam community post. This is where it just really irked me. Have a little read of the full quote:
Does GUG use ethical AI? What is the company’s stance on AI ethics?
As a company, we are frank and open about our usage of AI. To put it simply: we use language models and image models that are trained on content from the internet that’s been extracted without consideration for its copyright or licensing. Models of this size are huge for a reason. They must be able to store enormous amounts of information on the internet. When you type in “clam” or “bucket of milk” or “martian lawyers club” or “henry kissinger” you have a gug that (likely) incorporates content from that prompt — we can rely on the model’s vast understanding of content to help it synthesize knowledge into a usable format to inject into the game. The *same* is true of our image generation model. Our image model is fine-tuned on a collection of our own images. This allows the image model to faithfully utilize the game’s intended art direction, while incorporating the domain knowledge of the user’s request.
It is important to understand that the game is running on contentious technology. It is also important to understand that, at the current state of knowledge in machine learning, it is impossible to achieve the flexibility of the gug generation system in any other way. Anyone who claims that you can “train” your own model from scratch solely on your own data is either misinformed, is conflating the concepts of “training” and “finetuning”, or has enormous amounts of cash and time.
Of course we want to emphasize: as active members of the machine learning community and longtime video game developers, we are internally working on alternatives and planning to move ourselves further and further from unethical and unfair uses of AI. For example, there are open-source projects trying to replicate the capabilities of current models, while using fair-licensed datasets. We believe that what we have built is interesting and fun enough to our players and represents a valid use of this technology in its nascent state.
Emphasis in the above quote is mine.
With that in mind, they’re fully aware that the generative AI models they’re using are trained on other people’s work, scraped from all corners of the internet. But it’s made their game GUG “interesting and fun enough” right now, so who cares, right? Why put in the hard work to make your own stuff when the totally intelligent machines can do it for you?
When we have hundreds of games releasing on Steam constantly, why am I going to pick one where the developers aren’t even doing a lot of the work? From artwork to story text: why am I going to play or read it if it’s just made by a machine? What’s the point? The answer is: I’m not. I’m going to continue clicking ignore on them, and I hope to be able to stick by that stance for a long time.
Looking over on SteamDB, there’s now over 10,000 games with an AI disclosure notice. Of that number, 3,957 have a 2025 release date. It’s only going to keep increasing.
I suppose it’s yet another good thing about GamingOnLinux remaining 100% independent: I don’t have to cover anything. At times I can just write an article like this, pointing out how the industry is changing and throwing some thoughts up on it. I don’t like where this is going, but automation is inevitable in everything, isn’t it? Higher-ups always want to maximise everything, and some people are just lazy. What bugs me the most, though, is that the more you rely on AI generation, the less you learn and improve.