Shut up, that just makes it easier because you have a needle. The Godot pros break their finger against a concrete block and use the bone fragment that sticks out to individually edit the binary.
I do believe that what makes people angry regarding AI, particularly in video games, is that video games are still massively struggling to get recognized as an art form. Using LLM-based AI is directly hampering the chance of getting this recognition going.
From my point of view, the usage of AI, if it persists (meaning if the enormous societal costs related to AI, and the confiscation of all its benefits by a billionaire techno-fascist elite, don’t end up killing the social contract in most societies), will end up creating a schism: on one hand, you will have entertainment products made at scale using AI that will have no artistic value (because it can’t, by nature, open up a dialogue between the creator and the public); and on the other hand, underfunded artistic expression in the form of video games that will ban or drastically limit genAI usage in their production process.
To come back to the original post, this is what is currently puzzling. OP obviously doesn’t want to make a small cash-grab app, as they are publishing it with an open licence, and they also say they love music and that it’s an important part of the game, which indicates some form of intentionality behind the music. But at the same time, they remove any possible intentionality from their music by using a tool that is based on statistical inference and deprives them of control. The incoherence is quite striking to me.
In short, AI is a productivity tool that might one day allow people to produce bigger video games at the cost of any artistic value (I leave aside the societal issues here, which are ofc huge). It’s okay. Maybe not all video games need to be art. Maybe most video games aren’t, actually. Maybe, like in most domains of art, the fringe will be where creativity and artistry flourish, out of reach of capitalism.
I don’t really care any more about any other aspect of “ai” except data theft. It’s the only thing that needs to be constantly pointed out. Discussing deeper philosophical meanings of “ai”, art, the future, society, or whatever just detracts from the banal fact that “ai” is essentially one big data theft operation. That’s not a metaphor. It’s actual theft under existing property laws in most civilized countries. Philosophical discussions just muddy the waters for our beloved techno-accelerationist thieves to continue getting away with their criminal activities and, sadly, frame them as “progress”.
Training needs to be regulated. Full public disclosure of datasets should be required. Train your shit bots on your own data or on whatever public domain data is available to you, but stay away from intellectual property that’s not yours. That’s all there is to it as far as I’m concerned.
Ah, but if courts rule that it isn’t illegal, then it might be immoral, but not illegal. As are many things that involve making the rich richer.
Wikipedia is a great example; it was used at the very beginning. But why doesn’t OpenAI fund them, instead of leaving them to rely on donations?
Yeah, that would improve quality, and they could pay more people to make datasets and check what can be used to comply with copyright and licensing.
If you live in the UK, you have to pay a TV licence, which supports the BBC.
The BBC has done lots of documentaries which more than likely were used for training.
If an average person doesn’t have a licence, they go after them. Why isn’t it the same for those data centres?
The world was better before AI.
Or more so when AI meant a Goomba walking back and forth…
I am not sure who started the “AI is immoral” thing, but people should really be concerned about the kids growing up with AI. If this AI thing keeps up, well, you might as well be in the movie Idiocracy. If the next generation doesn’t have to put up with this nonsense, then they will be better off.
We are in Idiocracy, The Matrix, and lots of other movies.
True, true.
To ‘reject the use of AI’ is synonymous with ‘reject the user of AI’ for all intents and purposes.
Consider that, in order to persuade people of the merits of your argument, a positive example (i.e. practice what you preach) is better than coercion.
You asked why people downvote and reject games that use AI assets, got lots of helpful detailed answers that you didn’t acknowledge, and then seemingly continued minimizing and defending your use of AI assets, while criticizing the people who reviewed your game negatively. It doesn’t seem like you’re sincere in your request to understand the reasons. It seems more like you’re complaining and objecting to the popular position that using AI sucks. Complaining that people care about things you don’t care about won’t change the market.
If you don’t actually care about any of the reasons people are repelled by AI, or you don’t understand those reasons, and you want to make games people like, and you can’t afford expensive assets, then stop using AI assets, and switch to using inexpensive or free assets that aren’t AI generated. There are tons of free and low cost options for every kind of game asset. If you’re acting purely out of self interest, then you should avoid AI assets so you won’t drive so many of your customers away.
Well, that was me referring to the fact that LLMs steal copyrighted works for training data, plus the exorbitant power and water consumption, which both harms the environment and has already raised utility prices for people. Also, for those of us who live in places where there are already issues with droughts, the water consumption is concerning.
Agreed. But we are just now seeing the first lawsuits against social media companies. This week two were settled in favor of plaintiffs suing them. One in New Mexico for $375 Million against Meta. Another (I think in California, but the article doesn’t say) for $6 Million against Meta and YouTube (70/30 split).
There are already lawsuits against Grok for its ill-advised entrance into adult content. There are also laws being passed about AI, including ones addressing revenge images and blocking AI data centers from being built in the US.
It depends on how it’s approached, as it can be more nuanced than that if people thought less in terms of black and white. But it’s an emotional issue for many. This whole thread was started because someone created a game with LLM generated resources, and was attacked for it upon release. They didn’t understand why, and asked why. So here we are.
Not everyone understands why this is so upsetting to people. So it shocks them when they feel attacked. But there’s a difference between saying, “You shouldn’t use an LLM because of these reasons,” and “You’re a bad person for using an LLM.”
Practicing what you preach is good, but doing it silently doesn’t communicate anything.
That can be a difficult thing to do in this day and age. Especially if you work in software. This article talks about the data just released by Anthropic about how their LLMs are being used, and how there’s already a divide between people who know how to use LLMs and those who don’t.
I remember meeting a man who had told me about being in prison for 20 years. He told me when he went in, the Internet didn’t exist. When he was released, he was at the airport trying to find a pay phone to contact his family. He walked up to a security guard to ask where one was, and the guard lent him his cell phone to call his family. He’d only seen them on TV before, and it was a shock to him how much the world had changed. I imagine that kind of culture shock is going to hit a lot of people in the next few years as jobs change rapidly.
Especially since we have higher white-collar unemployment than blue-collar unemployment right now. Those people who are unemployed are going to have their jobs change on them while they’re out of the workforce. And so your choices are learn to use LLMs, or not have a job - regardless of how you feel about LLMs on a moral or ethical level. (This is the problem the Luddites were having in the 1800s.)
Take a look at all the LLM things that were presented at GDC this week. Here are two excerpts from the article:
Back in January, planners for the Game Developers Conference released their annual state of the games industry report for 2026, in which 52% of respondents reported that generative AI was used at their company, though only 36% said they’re using it as part of their jobs; some say it’s optional, at least for now. They mostly use the technology for research and brainstorming (81% of respondents), writing emails and scheduling (47%), or for code assistance (47%), among other tasks. But developers themselves are increasingly skeptical of generative AI, with 52% responding that it’s bad for the industry – up from 30% last year.
But as I walked around the halls of GDC 2026, I saw a stark juxtaposition: a handful of smaller games proudly using generative AI, and relative silence from the rest of the industry.
Related to this thread is an article on LLM-powered NPCs, reporting that 95% of the 62 players tested enjoyed interacting with them. However, that’s a small sample; we don’t know if the players were informed the NPCs were LLM-powered, and they only played the game for 20 minutes apiece.
Juxtaposed with that is an article about a person who listened to only AI-generated music for a week and hated it. But what I notice is they didn’t make any music themselves. I have noticed that while I enjoy sometimes listening to new songs on Suno, I almost exclusively listen to stuff I made myself. (For me personally, this goes back to the tightrope I walk where I don’t like a lot of things about LLMs, but I need to stay current with them for work.)
My current project has a $0 budget, and I have yet to use AI generation for music or images.
I was able to squeeze out $15 to buy some UI assets, and you can easily get audio loops for about the same price (if not free for commercial use).
The only AI I use is for coding, mostly to learn (I’m not a coder) and to troubleshoot. And even then, I’m using it less every time.
Money is not a valid reason for using AI-generated art while there are PLENTY of accessible assets of all kinds.
Sites like OpenGameArt are amazing and contain a lot of fantastic resources.
I don’t recommend using LLMs to learn Godot and GDScript. They do not understand the Godot editor, and give overly complicated coding solutions, often with fake function names or outdated code examples.
Also, if you don’t use LLM-generated art for ethical reasons, all those ethics still apply to the use of LLMs for coding. (Including the fact that being legally allowed to scrape people’s answers to programming questions on the internet does not make it necessarily ethical.) Beware of high horses - you can fall.
The same could be said about code.
Just a detail, and an honest question: is the water used for these data centres not recycled, and eventually put back into the water system? Where is the ‘consumption’ coming from? Losses from steam or boiling off, or simple evaporation?
(Disclaimer : I’m not trying to defend these data centres, simply wondering exactly what the issue here is…)
Um, can a moderator lock this forum thread? This has become ridiculous…
While it is true that the water will eventually “return” as it evaporates, what matters is where it returns.
The main issue here is that a lot of data centers are in areas where water is already a scarce resource, and by having that water siphoned off, just for it to maybe come back as rain somewhere else, the local population isn’t helped at all. This will also quickly contribute to an overall imbalance, where some areas will have too much water, and others will have barely any.
Save for a few of the comments being a bit lighthearted, there’s nothing ridiculous about this thread, and saying that people are being ridiculous is a bit offensive.