The AI witch hunt claims its first victim, apparently over some placeholder textures.
https://english.elpais.com/culture/2025-07-19/the-low-cost-c...
> Sandfall Interactive further clarifies that there are no generative AI-created assets in the game. When the first AI tools became available in 2022, some members of the team briefly experimented with them to generate temporary placeholder textures. Upon release, instances of a placeholder texture were removed within 5 days to be replaced with the correct textures that had always been intended for release, but were missed during the Quality Assurance process.
From the submitted article:
> "When it was submitted for consideration, representatives of Sandfall Interactive agreed that no gen AI was used in the development of Clair Obscur: Expedition 33. In light of Sandfall Interactive confirming the use of gen AI art in production on the day of the Indie Game Awards 2025 premiere, this does disqualify Clair Obscur: Expedition 33 from its nomination."
Whatever placeholder you use is part of your development process, whether it ships or not. Saying they used none when they did is not cool and rightfully makes one wonder what other uses they may be hiding (or “forgetting”). Especially when apparently they only clarified it when it was too late.
I can understand the Indie Game Awards preferring to act now. Had they done nothing, they would have been criticised too by other people for not enforcing their own rules. They no doubt would’ve preferred to not have to deal with the controversy. Surely this wasn’t an easy decision for them, as it ruined their ceremony.
We’re all bystanders here with very little information, so I’d refrain from using unserious expressions like “witch hunt”, especially considering their more recent connotations (i.e. in modern times, “witch hunt” is most often used by bad actors attempting to discredit legitimate investigations).
> Whatever placeholder you use is part of your development process, whether it ships or not. Saying they used none when they did is not cool and rightfully makes one wonder what other uses they may be hiding (or “forgetting”). Especially when apparently they only clarified it when it was too late.
If it was malicious they wouldn't say a word. They probably interpreted the rule as "nothing in the shipped game is AI" (which is a reasonable interpretation IMO), implemented a policy to replace any asset made by AI, and just missed some textures.
Also the term was pretty vague. Like, is using automatic lipsync forbidden? That's pretty much generative AI, just the result is not a picture but a sequence of movements.
Malicious or not, they didn't follow the rules, and admitted as much. So why is it a problem they lose the award?
That’s incredibly harsh. A blanket ban on AI generated assets is dumb as hell. Generating placeholder assets is completely acceptable.
I don’t care if the whole game from end to end is generative AI if it’s an incredible game. Having a moral stance against a specific use of floating point numbers and algorithms in a medium filled with floating point numbers and algorithms is strange.
it's even worse than that
there is a whole basket of technologies which you can label as "gen AI" but which have none of the problems that make people hate "gen AI"
as a very dumb example, some pretty decent "line smoothing" algorithms are technically gen AI but have none of the ethical issues
Is this actually a problem? Is there anybody actually arguing against line smoothing algorithms?
That’s the point. No one cares about line smoothing algorithms but they lose their mind if it’s background textures or throwaway voice lines.
No artists were previously smoothing lines for a living but they were painting textures and voice acting
They're not equivalent...
> technically gen AI but have non of the ethical issues
I don't know! I guess we'll have to wait for next year's Indie Game awards to see which prizes they retract that time and why. This is dumb.
Generating a brick wall texture using an AI should be acceptable as well, even when it's not a placeholder.
Yeah I'm fine with replacing generic stuff with generic AI stuff. Or cutting out the boring part, nobody needs to spend hours manually lip-syncing character or generating thousands of intermediate movement animation steps.
When genAI started making waves my first thought literally was how awesome it would be to flesh out NPC dialog.
It’s immersion breaking to try and talk to a random character only to hit a loop of one or two sentences.
How awesome would it be for every character to be able to have believable small talk, with only a small prompt? And it wouldn’t affect the overall game at all, because obviously the designers never cared to put in that work themselves
GenAI doing chore work is IMO the best use case
I agree, even though I'm not in favour of gen ai. It was a terrible mistake letting placeholder assets get out in the final release, but it shouldn't actually count as shipping AI-generated content in your product.
> representatives of Sandfall Interactive agreed that no gen AI was used in the development of Clair Obscur: Expedition 33
Even if all the AI-generated content had been replaced before release, this would still be a lie.
That's a dumb requirement for something purporting to be a general indie game award.
They should rename to the Digital Amish game awards or something.
not really. the spirit of the (dumb) rule is that AI was not involved anywhere in the creative process. placeholder textures don't even come close
it's like having doping rules in sports and then disqualifying someone for using caffeine in their gym plan.
It literally is shipping AI generated content in the product.
> It literally is shipping AI generated content in the product.
When someone goes three miles per hour over the speed limit they are literally breaking the law, but that doesn’t mean they should get a serious fine for it. Sometimes shit happens.
Sure, but maybe you shouldn't be surprised to be disqualified from the "Best Drive of the Year" award when you do.
Countries with sane laws include a tolerance limit to take into account flaws in speedometers and radars. Here in Brazil, the tolerance is 10%, so tickets clearly state "driving at speed 10% above limit".
and the rules of the contest did not include any sane boundaries.
Like, using automatic lipsync is "generative AI"; should that be banned? Do we really want to fight over a purely work-saving feature like that?
That is not sane, it is dumb. With such a system, you have signs that say "100" but the actual speed limit is "110" and everyone knows the actual speed limit is "110" but they all have to do mental math to reach that conclusion. Just make the sign say the real speed limit instead of lying to you. It's like Spinal Tap wrote your laws.
It’s not dumb, it’s accounting for real-world variance in car speedometer accuracy and possible inaccuracies in the measurement process. Just because your car is telling you you went 98, or the speed camera is telling you you went 101, doesn’t mean that was the actual speed of your car at that moment.
Speed limits are limits, not targets. That's why they're called speed *limits*. You account for variance in the speedometer and the reading device by staying under the limit, not treating it as a target.
I hope this does not come across as antagonistic but isn’t this then another form of mental math again? "I’m actually not allowed to drive the number on the sign but I’m also not allowed to drive a speed within the margin of error so I could be falsely accused of speeding."
The other way around seems more clear in a legal sense to me because we want to prove with as little doubt as possible that the person actually went above the speed limit. Innocent until proven guilty and all that. So we accept people speeding a little to not falsely convict someone.
As a driver, I control my speed for a variety of factors, but I assume no responsibility for the variance in the speed checking device. That’s on the people deploying them to ensure they’ve done their job (and is part of the reason tickets aren’t issued for 1kph/1mph over in most jurisdictions).
Powerful autism or equally powerful bait. Can't decide myself.
> That is not sane, it is dumb.
I understand where you’re coming from, but it’s perfectly sane if your legal system recognizes and accepts that speed detection methodologies have a defined margin of error; every ticket issued for speeding within that MoE would likely be (correctly) rejected by a court if challenged.
The buffer means, among other things, that you don’t have to bog down your traffic courts with thousands of cases that will be immediately thrown out.
So the sign says "100", the police read your speed at "112" but the device has a 5% MoE and in this case your actual speed was 107. Seems like you have exactly the same problem because the laws state the actual speed limit was "110" which you are under, despite being over the posted limit and the police reading you as over both the real and posted limits.
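The tolerance arithmetic being debated above is simple enough to sketch. This is a hypothetical helper, not anyone's actual law: it applies the margin of error in the driver's favour, per the interpretation that the buffer exists to absorb measurement uncertainty (real tolerance values and rules vary by jurisdiction):

```python
def ticketable(posted_limit, measured, tolerance=0.10):
    """Return True only if the measured speed exceeds the posted limit
    even after discounting the device's margin of error.
    Hypothetical helper; the 10% default is the Brazilian example above."""
    # Worst case in the driver's favour: assume the device over-read
    # by the full tolerance.
    lowest_plausible_speed = measured * (1 - tolerance)
    return lowest_plausible_speed > posted_limit

# Sign says 100; device reads 112; device has a 5% margin of error.
print(ticketable(100, 112, tolerance=0.05))  # True: even 106.4 is over the limit
print(ticketable(100, 104, tolerance=0.05))  # False: could really have been 98.8
```

Under this reading the buffer never raises the limit; it only discards readings that can't prove an infraction.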
You will literally get a fine for going three miles per hour over the speed limit in many countries.
True, however the penalty depends on the amount by which the threshold was crossed; in the country I live in at least.
I think the metaphor here would be more like getting your license permanently suspended for going 3 mph over. Whether that happens anywhere or not in reality, the point is, it would be an absurd overreaction.
Not getting the "didn't go over the speed limit" award when you did in fact go over the speed limit shouldn't be a big deal to anyone.
Nobody is preventing the studio from working, or from continuing to make (ostensibly) tons of money from their acclaimed game. Their game didn't meet the requirements for one particular GOTY award, boo hoo
"Sometimes shit happens" should be viable in a courtroom sometimes, but by nature competition rules leave less room for interpretation.
> Sometimes shit happens.
But you’re also not supposed to drive as close to the speed limit as possible. That number is not a target to hit, it’s a wall you should stay within a good margin of.
I understand analogies are seldom flawless, but the speed limit one in particular I feel does not apply because you can get a fine proportional to your infraction (go over the limit a little bit, small fine; go over it a lot, big fine) but you can’t partially retract an award, it’s all or nothing.
No, everybody treats the speed limit as the expected speed as long as conditions allow it.
“Everyone does it” has no bearing on whether it’s what should be done. Most people also speed up on yellow lights, but you should be doing the exact opposite.
This depends on the country. In certain countries, speed limits are set by civil engineers as a true upper limit that one is not supposed to exceed. In others, speed limits are set slightly above the average speed one is expected to drive at.
In the former sort of country, drivers are expected to use their judgement and often drive slower than the limit. In the latter sort of country, driving at the speed limit is rather... limiting, thus it is common to see drivers slightly exceeding the speed limit.
(I have a theory in my head that – in general – the former sort of country has far stricter licensing laws than the latter. I am not sure if this is true.)
The problem I have with the whole "licensing standards" thing is that, for everyday activities for most of the population, it's not realistic to regulate to the point that there are really substantial barriers to entry to the degree there are for flying in general. And experience probably counts for more than making people shell out a couple thousand more for courses.
The usual argument in favor of stricter licensing is coupled with improvement in public transit.
Which is really going to help me living 50 miles outside a major city. (Which is considered urban according to the US Census.)
I believe in giving someone a reasonable amount of time to correct their mistakes. Yes, it was a terrible mistake to release the game in that state, but I think correcting it within days is sufficient.
It's not a "terrible mistake" to accidentally ship placeholder textures. Let's tone it down just a wee bit, maybe.
Anyway, I don't agree with banning generative AI, but if the award show wants to do so, go ahead. What has caused me to lose any respect for them is that they're doing this for such a silly reason. There's a huge difference between using AI to generate your actual textures and ship those, and.... accidentally shipping placeholder textures.
It really illustrates that this is more ideological than anything.
How could a gen AI ban be anything but ideological?
If you ever made a typo on an official document, would you want it to be uncorrectable, with you forever responsible for the results? Yeah, that's about that level of silly.
Blanket ban on generative AI? Games have been using some form or another ever since the days of RTS map generation and Perlin noise
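For concreteness, the kind of "generation" games have always done is classic procedural algorithms like this midpoint-displacement terrain sketch (names and parameters are illustrative, not from any engine); note there is no training data or scraped art anywhere in it:

```python
import random

def midpoint_displacement(left, right, depth, roughness=0.5, seed=42):
    """Classic fractal terrain: repeatedly insert a randomly perturbed
    midpoint between neighbouring height samples.  'Generative' in the
    procedural sense only -- pure math plus a seeded RNG."""
    rng = random.Random(seed)  # seeded so the terrain is reproducible
    heights = [left, right]
    spread = 1.0
    for _ in range(depth):
        refined = []
        for a, b in zip(heights, heights[1:]):
            mid = (a + b) / 2 + rng.uniform(-spread, spread)
            refined += [a, mid]
        refined.append(heights[-1])
        heights = refined
        spread *= roughness  # smaller bumps at finer scales
    return heights

terrain = midpoint_displacement(0.0, 0.0, depth=4)
print(len(terrain))  # 2**4 + 1 = 17 height samples
```

Whether a rule against "generative AI" is meant to cover this sort of thing is exactly the ambiguity being argued about.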
That is not the sort of thing people are referring to when they use the term “generative AI”. It’s basically a completely different technology and the ethical concerns around data sourcing and energy usage are not the same at all.
It's extremely tiring how people pretend like there's no difference between these technologies. The comments on the article are the epitome - "oh they used a computer to make a computer game, the horror"
Just a cudgel to shut down discussion
Blanket ban on AI generated assets, but nobody cares about AI generated code apparently. lol
Vibe coded games definitely seem like a lot more of an issue in my books than a few minor textures having been generated.
I don't find it that surprising. The creatives that are against generative AI aren't against it only because it produces slop. They are against it because it uses past human creative labor, without permission or compensation, to generate profit for the companies building the models which they do not redistribute to the authors of that creative labor. They are also against it due to environmental impact.
In that view, it doesn't matter whether you use it for placeholder or final assets. You paying your ChatGPT membership makes you complicit with the exploitation of that human creative output, and use of resources.
They are also against it because they believe it will compete with them and they will get paid less.
That’s also a valid reason to be against it!
I disagree, this is the worst reason to be against it. It's choosing horses over trains. Manual labor over engines, mail over e-mail. It's basically purely egotistical, placing something as fleeting as your current job over the progress of humanity.
The creatives that are the loudest voices against AI for art asset generation in my experience are technically competent but lacking any real pizzazz or uniqueness that would set them apart from generated art, so they feel extremely threatened.
There's also been an extremely effective propaganda campaign by the major entertainment industry players to get creatives to come out against AI vocally. I'd like to see what percentage of those artists made the statement to try and curry favor with the money suits.
Without making a judgment call on quality, it is definitely established artists who rely largely on their technical ability for a living (and their hangers-on) who are most vocal. And they focus on the dual indignities of their style being easily-reproducible in aggregate, but also each individual work having glaring mistakes that they'd never make, while ignoring the actual point of theft - when model builders scraped their work specifically for use in a commercial product.
>There's also been an extremely effective propaganda campaign by the major entertainment industry players to get creatives to come out against AI vocally.
Where can I find out more about this?
Several of the major voices were Disney employees iirc. Disney's goal has always been to have a monopoly on "their" IP, AI applications included.
Except it uses existing art transformatively, which means that even under our absurd, dystopian IP laws, it’s not exploitation. There isn’t a single artist out there who wouldn’t be running afoul of copyright law if that wasn’t the case.
It’s been insane to me to watch the “creative class”, long styled as the renegade and anti-authoritarian heart of society, transform into hardline IP law cheerleaders overnight as soon as generative AI burst onto the scene.
And the environmental concerns are equally disingenuous, particularly coming from the video game industry. Please explain to me how running a bunch of GPUs in a data center to serve people’s LLM requests is significantly more wasteful than distributing those GPUs among the population and running people’s video games?
At the end of the day, the only coherent criticism of AI is that it stands to eliminate the livelihood of a large number of people, which is a perfectly valid concern. But that’s not a flaw of AI, it’s a flaw of the IP laws and capitalistic system we have created. That is what needs addressing. Trying to uphold that system by stifling AI as a technology is just rearranging deck chairs on the Titanic.
That should be the crux of the issue, and stated plainly.
This is just another scheme where those at the top are appropriating the labor of many to enrich themselves. This will have so many negative consequences that I don't think any reactions against it are excessive.
It is irrelevant whether AI has "soul" or not. It literally does not matter, and it is a bad argument that dilutes what is really going on.
There is still human intentionality in picking an AI-generated resource for a surface texture, landscape, concept art, whatever. Doubly so if it is someone who creates art themselves using it.
I wish we could just land on a remedy for this, specifically. "Everyone who'd ever posted to deviantArt, ArtStation, etc., before they were scraped gets a dividend in perpetuity." And force MANGAF to pay. Finally, a way for their outsize profits to flow to the people who've been getting the shit end of the compensation stick since online art platforms and social media became a thing.
It'll never happen because the grift is the point.
The problem of allowing "placeholder AI assets" is that any shipped asset found to be AI is going to be explained away as being "just a placeholder". How are we supposed to confirm that they never meant to ship the game like this? All we know is that they shipped the game with AI assets.
Adding to that: 'it was a placeholder' has been used to excuse direct (flagrant) plagiarism from other sources, such as what happened with Bungie and their game Marathon
Shouldn't there be an argument for best effort? If the issue was removed as soon as it was detected, doesn't that count for something?
Because they can most likely prove the actual assets were on their version control years ago but weren’t applied to the models.
How? We don't have access to their version control. How do you validate an external version control to be accurate and reflective of the state years ago? Git histories can be rewritten as one pleases.
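The rewritability claim is easy to demonstrate. A minimal sketch in a throwaway repo (filenames and messages are made up): git commit timestamps are just environment-supplied metadata, so a commit made today can claim any date.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo "final texture" > wall.txt
git add wall.txt
# Author/committer dates are plain metadata read from the environment;
# nothing stops you from claiming the work happened years ago.
GIT_AUTHOR_DATE="2022-01-01T00:00:00 +0000" \
GIT_COMMITTER_DATE="2022-01-01T00:00:00 +0000" \
git -c user.name=dev -c user.email=dev@example.com commit -q -m "backdated commit"
git log --format=%ci   # prints a 2022 date for a commit made today
```

So a history alone proves little; an auditor would have to correlate it with external evidence (pushes to hosted remotes, backups, build artifacts).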
A forensic auditor would find that out, unless they do a full company wide purge of all local and remote Git histories.
But I'm kinda thinking this isn't THAT serious =)
But creating and picking those placeholders used to be somebody's job, maybe a junior artist. Now they're automated off the back of somebody else's work. And here we have an admission, but how many artists are being sidestepped in major games developers now? It won't be long before the EAs and Ubisofts of the world fire theirs. Then it'll be developers. Then it'll just be a committee of dolphins picking balls to feed into a black box that pumps out games.
It doesn't seem strange that an industry award protects the workers in the industry. I agree, it seems harsh, but remember this is just a shiny award. It's up to the Indie Game Awards to decide the criteria.
> But creating and picking those placeholders used to be somebody's job, maybe a junior artist.
Is it really though? After all it's just maybe a junior artist.
I've had to work with some form of asset pipeline for the past ten years. The past six in an actual game though not AAA. In all these years, devs have had the privilege of picking placeholders when the actual asset is not yet available. Sometimes, we just lift it off Google Images. Sometimes we pick a random sprite/image in our pre-existing collection. The important part is to not let the placeholder end up in the "finished" product.
> It's up to the Indie Game Awards to decide the criteria.
True and I'm really not too fond of GenAI myself but I can't be arsed to raise a fuss over Sandfall's admission here. As I said above, the line for me is to not let GenAI assets end up in the finished product.
> But creating and picking those placeholders used to be somebody's job, maybe a junior artist
Realistically, no.
> But creating and picking those placeholders used to be somebody's job, maybe a junior artist.
This argument in this industry is problematic. The entire purpose of computers is to automate processes that are labor intensive. Along the way, the automation process went from doing what was literally impossible with human labor to capturing ever deeper levels of skill of the practitioners. Contrast early computer graphics, which involved intensive human intervention, to modern tools. Since HN almost certainly has more developers than graphics artists, contrast early computer programming (where programmers didn't even have the assistance of assemblers and where they needed a deep knowledge of the computer architecture) to modern computer programming (high level languages, with libraries abstracting away most of the algorithmic complexity).
I don't know what the future of indie development looks like. In a way, indie development that uses third-party tools that captures the skills of developers and graphics artists traditionally found in major studios doesn't feel very indie. On the other hand, they are already doing that through the use of game engines and graphics design/modelling software. But I do know that a segment of the industry that utterly ignores those tools will be quickly left behind.
It's bad because it takes someone's job? But that job was mundane, petty work that seniors didn't want to bother with. Were cars terrible for taking all of those stableboy jobs? Is Excel or data engineering terrible for the obliteration of data entry and low-level bookkeeping jobs? Or is that just a slippery-slope argument, when what's happening is, IMO, the evolution of tech? People will adapt. While it's up to any event organizer to decide their rules, AI witch hunts are a Luddite response. AI/LLMs can be major tools in the belt of indies to dethrone AAA. To be clear, I'm arguing in favor of tooling such as the example of placeholder usage and a pipeline to remove it. I wouldn't defend a scumbag leveraging AI to rip off another game, artist, or dev. It just seems like the lines are being blurred to justify AI witch hunts.
The game industry, especially AAA, is actually having a major identity crisis right now as technology evolves and jobs adapt around the new tool of AI/LLMs. The Game Awards (not the indie one) should demonstrate that the dolphin committee you fear already exists, because the limiting factors in every industry are the major resources: time, capital, experience. AI/LLMs will enable far more high-skill work to be accomplished with less experience, time, and possibly capital (sidestepping the ethics/practicality of data centers).
It's not about the asset; it's about them first claiming that they did not use gen AI during production. One is an oopsie, the other a blatant lie. If the award requirements say you can't participate if you used generative AI, and you lie about it, it's a pretty clear-cut case. Either be certain you don't ship AI placeholders, or just don't lie. The outrage in this thread, and the hyper-focus on the asset instead of the lie, is the problem.
I would consider myself pretty embedded in the gaming space, and I hadn't heard of the "Indie Game Awards" before yesterday. Last year's award show has <100k views on youtube, and the first article mentioning this (insider-gaming.com's) is written by one of the judges involved.
I'll leave it up to the reader to judge how much of this is genuine and how much is jumping the twitter bandwagon to boost the award show's popularity.
> I would consider myself pretty embedded in the gaming space, and I hadn't heard of the "Indie Game Awards" before yesterday
Very curious to hear what channels you follow and how often per week. My RSS feed was spammed with it this year and in previous years
I'm not trying to defend "The Indie Game Awards", which I also have never heard of, but The Game Awards are universally acknowledged to be a joke, and always have been. By runtime, it's 80% soulless, samey trailers for AAA games, 10% Imagine Dragons, 5% rooting for that coked-up clarinet player on the edge of the orchestra, and 5% Jeff Keighley rapid-firing off the winners of made-up award categories in under five seconds each.
There was no judgement in his comment.
He just pointed out (correctly) that the game awards that were being spoken of everywhere for the last few weeks were not the one related to this article.
I on the other hand will add a judgement to this discussion: if you consider The Game Awards a joke, despite it being by far the most watched event in gaming, eclipsing (by viewer count) entertainment events in sports such as the NBA finals... you've certainly got "interesting" opinions.
Re: Judgement:
I think it's exactly their popularity that led people to call the big award shows "a joke." Pretty common with stuff like the Emmys and Grammys.
Indeed I was confused. I am an idiot and sorry!
I don’t think the name confusion can really be blamed on “The Indie Game Awards”, it has to be on “The Game Awards” for choosing the most generic possible name.
Mostly daily browsing of twitter and reddit, r/livestreamfail + various discord communities. Note that i have heard lots about The Game Awards, but this is a different event.
FYI, last year was the first ever Indie Game Awards.
I'll admit I'm not nearly as in the gaming space as I once was, but this is a sentiment I've heard repeated numerous times from friends who still are.
That, and utter surprise at the lack of recognition shown towards Silksong.
There is a small irony that the Indie Game Awards rejects nominations of games using AI but The Game Awards does not. It is independent teams of developers, less likely to be able to afford to pay an artist, who may be able to produce something of value with AI assets that they otherwise would not have the resources for. On the other side, it is big studios with a good track record and more investment who are more likely to be able to pay artists and benefit from their artistry.
To me, art is a form of expression from one human being to another. An indie game with interesting gameplay but AI generated assets still has value as a form of expression from the programmer. Maybe if it's successful, the programmer can afford to pay an artist to help create their next game. If we want to encourage human made art, I think we should focus on rewarding the big game studios who do this and not being so strict on the 2 or 3 person teams who might not exist without the help of AI.
(I say this knowing Clair Obscur was made by a large well respected team so if they used AI assets I think it's fair their award was stripped. I just wish The Game Awards would also consider using such a standard.)
I agree that this holds in theory, but in practice? All the overhyping of AI I've heard from the gaming sector has come from the big studios, not indies. And, as you point out, Clair Obscur isn't the 'most indie' of indies anyway.
Maybe the small studios just use it as a tool and don’t need to hype it to look good for shareholder interests.
That's what the semi-recent whining about Larian saying they use AI was about.
They just use it to cut some of the boring work and to iterate over some ideas; once an idea is set in stone, an actual artist does it.
I don't see the problem, because it isn't cutting more artists out of the loop; if anything, they get more of the meaningful work
Who is hyping the technology doesn't seem to be too relevant. Big studios have a bigger megaphone and, as another has pointed out, possibly even a financial motivation for shouting it from the rooftops for their investors to hear.
It doesn't have to be hyped to be used, for example today I found these two building their passion project using GenAI, which would otherwise maybe not possible, who knows: https://reddit.com/comments/1prqfsu
Simple fix: split the categories. Have a game art award (no AI), and let the rest of the categories (perhaps including game of the year, best new game) allow AI.
Right now the rules they're using are going against larger forces in the world that are going to become standard (if they're not already).
And to your point, these are indie developers, Davids going up against the AAA Goliaths with bottomless purses with which to shower money on a "product". I dabble in art (and wrote some indie games decades ago) and I am fine with AI-generated art (despite my daughters' leanings in the opposite direction).
I'd agree if this were about The Game Awards or similar, where indie devs are expected to compete against the AAA goliaths, but I've always understood the Indie Game Awards as being more about the craft than the end product.
From the FAQ:
> The ceremony is developer-focused, with game creators taking center stage to present awards, recognize their peers, and openly speak without the pressure of time constraints.
https://www.indiegameawards.gg/faq
Regardless of AI's inevitability, I don't particularly care to celebrate its use, and I think that's the most reasonable stance for a developer-focused award to take.
That's a good point—this being the indie game awards. I still think it makes sense to have separate categories that allow for AI-generated content but "indie developed" (versus an "Indie Art Award" that absolutely prohibits AI-generated content).
We should be able to celebrate the creation, execution, concept of a game without letting AI assets nullify the rest.
It's the same thing as local restaurants being picky about using organic and environmentally sustainable ingredients while big chain corporations have a preference for low cost ingredients that strip the environment bare. The big corporations could afford organic stuff, but their aim is to just get a product out there and get it done cheaply. The local restaurant can't often compete on price alone, so they sell themselves as being made with care for the consumer. Selling one's product as a moral option has been a fairly reliable marketing tactic for a long time and I'm kind of surprised it's taken this long to enter the gaming industry.
There's not that much irony considering how people into indie games are more about the art and craft of video games, whereas The Game Awards is a giant marketing cannon for the video game industry, and the video game industry has always been about squeezing their employees. If they can hire fewer artists and less QA because of GenAI, they're all for it.
Just two days ago there were reports that Naughty Dog, a studio that allegedly was trying to do away with crunch, was requiring employees to work "a minimum of eight extra hours a week" to complete an internal demo.
https://bsky.app/profile/jasonschreier.bsky.social/post/3mab...
> To me, art is a form of expression from one human being to another. An indie game with interesting gameplay but AI generated assets still has value as a form of expression from the programmer.
How though? If questions about style or substance can be answered with "because the AI did it, it's just some stochastic output from the model," I don't see how that allows for expression between humans.
Because a human selected it for you to see. If I send you a book to read, the book still has content and value even if I didn't write it.
In this case, you'd be judging the AI made assets as simply AI made and the human made gameplay and programming as human made. I'm not suggesting the AI assets would be transformed into art just because they are part of some human creative work.
You’re not wrong, but I think a hardline stance is pragmatic for keeping AI out while it’s not yet normalized.
There's an interesting question about scope.
The IGA FAQ states, in its entirety on this topic: "Games developed using generative AI are strictly ineligible for nomination." [1]
Sandfall probably interpreted this reasonably: no AI assets in the shipped product. They say they stripped out AI placeholders before release (and patched the ones they missed). But the IGA is reading it strictly: any use during development disqualifies.
If that's the standard, it gets interesting. DLSS and OptiX are trained neural networks in an infrastructure-shaped raincoat—ML models generating pixels that were never rendered. If you used Blender Cycles with OptiX viewport denoising while iterating on your scenes, you "developed using generative AI."
By a strict reading, any RTX-enabled development pipeline is disqualifying. I wonder if folks have fully thought this through.
[1] https://www.indiegameawards.gg/faq (under "Game Eligibility")
Before people chime in to claim "That isn't what we meant"...
DLSS and Cycles denoising are, well, denoising. It's the same process as denoising in Stable Diffusion, essentially, and was trained in the same way.
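To make the claimed equivalence concrete, here is a toy sketch (all numbers and functions are illustrative, not any real DLSS or Stable Diffusion API): both a render denoiser and a diffusion model are trained to predict the noise in an image so it can be subtracted out. The only difference is what you start from.

```python
# Toy denoising step shared by render denoisers and diffusion models.
# A "frame" here is just a list of pixel values; everything is illustrative.

def denoise_step(noisy, predicted_noise):
    """One denoising step: subtract the model's noise estimate."""
    return [p - n for p, n in zip(noisy, predicted_noise)]

# A path-traced frame = clean image + Monte Carlo noise.
clean = [0.2, 0.5, 0.8]
noise = [0.05, -0.03, 0.02]
noisy = [c + n for c, n in zip(clean, noise)]

# A render denoiser applies this step once to a noisy render; a diffusion
# model iterates the same step many times starting from pure noise,
# "generating" an image instead of restoring one.
restored = denoise_step(noisy, noise)
print(restored)
```

The point being: if "a trained network predicting pixels that were never rendered" is the disqualifying criterion, the same step sits inside both pipelines.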
I bet that if they'd only used AI-assisted coding, this would have been a complete non-event, but oh no, some inconsequential assets were generated, grab the pitchforks!
I’d take that bet against you.
The challenge of course is determining if AI was used in the coding.
Ok great, but you don't really say much.
You think there’s any non-niche game developer not using a coding assistant at this point? You think Epic is not using code assistants to develop Unreal Engine?
It is a non-event for consumers; the only ones who care much are artists.
As always the market decides.
Maybe, but that is a different issue.
The use of generative AI for art is being rightfully criticised because it steals from artists. Generative AI for source code learns from developers - who mostly publish their source with licenses that allow this.
The quality suffers in both cases, and I would personally criticise generative AI in source code as well, but the ethical argument is only against profiting from artists' work without their consent.
> rightfully criticised because it steals from artists. Generative AI for source code learns from developers
The double standard here is too much. Notice how one is stealing while the other is learning from? How are diffusion models not "learning from all the previous art"? It's literally the same concept. The art generated is not a 1-1 copy in any way.
IMO, this is key to the issue, learning != stealing. I think it should be acceptable for AI to learn and produce, but not to learn and copy. If end assets infringe on copyright, that should be dealt with the same whether human- or AI-produced. The quality of the results is another issue.
> I think it should be acceptable for AI to learn and produce, but not to learn and copy.
Ok but that's just a training issue then. Have model A be trained on human input. Have model A generate synthetic training data for model B. Ensure the prompts used to train B are not part of A's training data. Voila, model B has learned to produce rather than copy.
Many state of the art LLMs are trained in such a two-step way since they are very sensitive to low-quality training data.
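The two-step scheme described above can be sketched with a deliberately trivial stand-in for a "model" (here, a least-squares line fit; real pipelines use LLMs and this is purely illustrative): model A is fit on human-labelled data, A labels fresh inputs it never saw during training, and model B is fit only on that synthetic output.

```python
def fit_line(xs, ys):
    # "Training": least-squares fit of y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Step 1: "model A" is trained on human-produced data (here: y = 2x + 1).
human_x = [0.0, 1.0, 2.0, 3.0]
human_y = [1.0, 3.0, 5.0, 7.0]
a1, b1 = fit_line(human_x, human_y)

# Step 2: model A labels fresh inputs that were NOT in its training set,
# producing a synthetic dataset.
synth_x = [10.0, 20.0, 30.0, 40.0]
synth_y = [a1 * x + b1 for x in synth_x]

# Step 3: "model B" is trained purely on A's synthetic output and still
# recovers the behaviour, one step removed from the human data.
a2, b2 = fit_line(synth_x, synth_y)
print(a2, b2)
```

Whether that extra step of removal changes the ethics of the original training data is, of course, exactly the question under debate.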
> The art generated is not a 1-1 copy in any way.
Yeah right. AI art models can and have been used to copy basically any artist’s style, in ways that make the original artist’s hard work and effort in honing their craft irrelevant.
Who profits? Some tech company.
Who loses? The artists who now have to compete with an impossibly cheap copy of their own work.
This is theft at a massive scale. We are forcing countless artists whose work was stolen from them to compete with a model trained on their art without their consent and are paying them NOTHING for it. Just because it is impressive doesn’t make it ok.
Shame on any tech person who is okay with this.
Copying a style isn’t theft, full stop. You can’t copyright style. As an individual, you wouldn’t be liable for producing a work of art that is similar in style to someone else’s, and there is an enormous number of artists today whose livelihood would be in jeopardy if that was the case.
Concerns about the livelihood of artists or the accumulation of wealth by large tech megacorporations are valid but aren’t rooted in AI. They are rooted in capitalism. Fighting against AI as a technology is foolish. It won’t work, and even if you had a magic wand to make it disappear, the underlying problem remains.
I really don't agree with this argument because copying and learning are so distinct. If I write in a famous author's style and try to pass my work off as theirs, everyone agrees that's unethical. But if I just read a lot of their work and get a sense of what works and doesn't in fiction, then use that learning to write fiction in the same genre, everyone agrees that my learning from a better author is fair game. Pretty sure that's the case even if my work cuts into their sales despite being inferior.
The argument seems to be that it's different when the learner is a machine rather than a human, and I can sort of see the 'if everyone did it' argument for making that distinction. But even if we take for granted that a human should be allowed to learn from prior art and a machine shouldn't, this just guarantees an arms race for machines better impersonating humans, and that also ends in a terrible place if everyone does it.
If there's an aspect I haven't considered here I'd certainly welcome some food for thought. I am getting seriously exasperated at the ratio of pathos to logos and ethos on this subject and would really welcome seeing some appeals to logic or ethics, even if they disagree with my position.
No, the only difference is that image generators are a much fuller replacement for "artists" than for programmers currently. The use of quotation marks was not meant to be derogatory, I'm sure many of them are good artists, but what they were mostly commissioned for was not art - it was backgrounds for websites, headers for TOS updates, illustrations for ads... There was a lot more money in this type of work, the same way as there is a lot more money in writing React sites, or scripts to integrate Active Directory logins into some ancient inventory management system, than in developing new elegant algorithms.
But code is complicated, and hallucinations lead to bugs and security vulnerabilities so it's prudent to have programmers check it before submitting to production. An image is an image. It may not be as nice as a human drawn one, but for most cases it doesn't matter anyway.
The AI "stole" or "learned" in both cases. It's just that one side is feeling a lot more financial hardship as the result.
Finally a good point in this thread.
There is a problem with negative incentives, I think. The more generative AI is used and relied upon to create images (to limit the argument to image generation), the less incentive there is for humans to put in the effort to learn how to create images themselves.
But generative AI is a dead end. It can only generate things based on what already exists, remixing its training data. It cannot come up with anything truly new.
I think this may be the only piece of technology humans created that halts human progress instead of being something that facilitates further progress. A dead end.
> Generative AI for source code learns from developers - who mostly publish their source with licenses that allow this.
I always believed GPL allowed LLM training, but only if the counterparty fulfills its conditions: attribution (even if not for every output, at least as part of the training set) and virality (the resulting weights and inference/training code should be released freely under GPL, or maybe even the outputs). I have not seen any AI company take any steps to fulfill these conditions to legally use my work.
The profiteering alone would be a sufficient harm, but it's the replacement rhetoric that adds insult to injury.
> Generative AI for source code learns from developers - who mostly publish their source with licenses that allow this.
As far as I'm concerned, not at all. FOSS code that I have written is not intended to enrich LLM companies and make developers of closed source competition more effective. The legal situation is not clear yet.
To me, if the AI is trained on GPLv3/AGPL code, any code it generates should be GPLv3/AGPL too; the licence seems clear imho.
FOSS code is the backbone of many closed source for-profit companies. The license allows you to use FOSS tools and Linux, for instance, to build fully proprietary software.
Well, if its GPL you are supposed to provide the source code to any binaries you ship. So if you fed GPL code into your model, the output of it should be also considered GPL licensed, with all implications.
Sure, that usage is allowed by the license. The license does not allow copying the code (edit: into your closed-source product). LLMs are somewhere in between.
"Mostly" is doing some heavy lifting there. Even if you don't see a problem with reams of copyleft code being ingested, you're not seeing the connection? Trusting the companies that happily pirated as many books as they could pull from Anna's Archive and as much art as they could slurp from DeviantArt, pixiv, and imageboards? The GP had the insight that this doesn't get called out when it's hidden, but that's the whole point. Laundering of other people's work at such a scale that it feels inevitable or impossible to stop is the tacit goal of the AI industry. We don't need to trip over ourselves glorifying the 'business model' of rampant illegality in the name of monopoly before regulations can catch up.
I'm not sure how valid it is to view artwork differently than source code for this purpose.
1. There is tons of public domain or similarly licensed artwork to learn from, so there's no reason a generative AI for art needs to have been trained on disallowed content anymore than a code generating one.
2. I have no doubt that there exist both source code AIs that have been trained on code that had licenses disallowing such use and art AIs have that been trained only on art that allows such use. So, it feels flawed to just assume that AI code generation is in the clear and AI art is in the wrong.
> The use of generative AI for art is being rightfully criticised because it steals from artists. Generative AI for source code learns from developers - who mostly publish their source with licenses that allow this.
This reasoning is invalid. If AI is doing nothing but simply "learning from" like a human, then there is no "stealing from artists" either. A person is allowed to learn from copyright content and create works that draw from that learning. So if the AI is also just learning from things, then it is not stealing from artists.
On the other hand if you claim that it is not just learning but creating derivative works based on the art (thereby "stealing" from them), then you can't say that it is not creating derivative works of the code it ingests either. And many open source licenses do not allow distribution of derivative works without condition.
Everyone in this thread keeps treating human learning and art the same as clearly automated statistical processes with massive tech backing.
Analogy: the common area had grass for grazing which local animals could freely use. Therefore, it's no problem that megacorp has come along and created a massive machine which cuts down all the trees and grass which they then sell to local farmers. After all, those resources were free, the end product is the same, and their machine is "grazing" just like the animals. Clearly animals graze, and their new "gazelle 3000" should have the same rights to the common grazing area -- regardless of what happens to the other animals.
Most OSS licenses require attribution, so AI for code generation violates licenses the same way AI for image generation does. If one is illegal or unethical, then the other would be too.
Is there an OSS licence that excludes LLMs?
I'm not sure about licenses that explicitly forbid LLM use -- although you could always modify a license to require this! -- but GPL licensed projects require that you also make the software you create open source.
I'm not sure that LLMs respect that restriction (since they generally don't attribute their code).
I'm not even really sure if that clause would apply to LLM generated code, though I'd imagine that it should.
Very likely no license can restrict it, since learning is not covered under copyright. Even if you could restrict it, you couldn't add a "no LLMs" clause without violating the free software principles or the OSI definition, since you cannot discriminate in your license.
"Learning" is what humans can do. LLMs can't do that.
“Learning” as a concept is too ill defined to use as a distinction. What is learning? How is what a human does different from what an LLM does?
In the end it doesn’t matter. Here “learning” means observing an existing work and using it to produce something that is not a copy.
They don't require it if you don't include OSS artifacts/code in your shipped product. You can use gcc to build closed source software.
> You can use gcc to build closed source software
Note that this tends to require specific license exemptions. In particular, GCC links various pieces of functionality into your program that would normally trigger the GPL to apply to the whole program, and for this reason, those components had to be placed under the "GCC Runtime Library Exception"[1]
[1]: https://www.gnu.org/licenses/gcc-exception-3.1.html
those that require attribution
so... all of them
> The quality suffers in both cases
According to your omnivision?
The game won GOTY on its merits. Then the AI disclosure came out and it got stripped. If AI use produces obviously inferior work, how did it win in the first place? Seems like the objection is to the process, not the result.
Doubly so if the usage was de minimis.
I think it's the artists, not the tools, that make the art. Overuse of anything is gauche; but I am confident that beautiful things can be made with almost any tool, in the hands of a tasteful artist.
> Seems like the objection is to the process, not the result.
Right. The game is not eligible for the award. This is not a comment on the quality of the game.
The Indie Game Awards require zero AI content. The devs fully intended to ship without AI content but made a mistake, disqualifying themselves for the award. This is simply how competition rules work. I have a friend who plays competitive trading card games, and one day he showed up to a national event with an extra card in his box after playing with some friends late at night. It was an honest mistake, and the judges were quite sympathetic, but he was still disqualified!
BTW, the game is incredible.
> I think it's the artists, not the tools, that make the art.
I've never liked this argument. If AI is a tool, then having my own personal woodworker on staff makes me a woodworker too.
I've never liked the argument that there's some imaginary line between the acceptability of AI as a tool for creating art and Photoshop/Krita/Procreate/etc as a tool for creating art.
Rubbing a brush on a canvas was good enough for the renaissance masters, why are we collectively okay with modern "artists" using "virtual brushes" and trivializations of the expressive experience like "undo" when it's not "real art" because they're leaning so heavily on the uncaring unthinking machine and the convenience in creation it offers rather than suffering through the human limitations that the old masters did? Are photographers not artists too then, because they're not actually creating, just instead capturing a view of what's already there?
The usual response to this is some trite claim about how AI is 'different' because you're 'just' throwing prompts at it and it's completely creating the output itself -- as if it's inconceivable that there might be someone who doesn't just shovel out raw outputs from an AI and call it 'art' and is instead actually using it in a contributory role on a larger composition that they, themselves, as a human, are driving and making artistic decisions on.
E33 is a perfect example here. Is the artistic merit of the overall work lessened by it having used AI in part of its creation? Does anyone really, truly believe that they abdicated their vision on the overall work to machines?
Just because someone can drag and drop to draw a circle in an image editing app instead of using their own talent and ability to freehand it instead doesn't mean what they then go on to do with that circle isn't artistic.
I agree with you, and I frankly wasn't trying to reopen the can of worms about AI & art. As I said, I just don't like that particular line of reasoning about AI usage.
Like most things, art exists on a spectrum and there are many levels. Most would say a single pixel isn't art, yet at some point many cross some invisible line where it becomes art. Likewise, at some point a bunch of logic and pixels become a best-selling indie game. It's more than the sum of its parts, and I don't agree with saying that sum is suddenly less just because one of those parts was AI generated. The sum should logically be the same value regardless.
But then that's a very mathematical way of looking at it. Art and the appreciation of it has never been logical, but instead emotional. AI invokes negative emotions in many people, and so the art is diminished in their eyes. This makes sense to me.
However, I don't necessarily agree with this approach of yanking back the award. It reeks of horse buggy whip manufacturers trying to push back the tide. But then I've never understood comparing one piece of art to another and declaring one the winner. If art is simply something that invokes emotion in the viewer, and everyone's emotional response is different, it makes no sense to have awards to me.
We do make this argument all the time though. Film is probably the number one example in my mind. After actors, we celebrate directors more than any other individual in film. Directors often don’t write the script. They don’t handle the camera or the lighting or sound. They don’t create the music. They don’t do the editing in post. They don’t do the acting. But they do direct all of the people doing those things to achieve an overall vision, and we recognize that has significant artistic merit. Directors are not artists, or cinematographers, or composers, or actors, or visual effects artists, or sound technicians. But they are still artists, because art is more than the technical skill to produce something.
just because auteur theory is popular doesn't mean it's correct
I dislike this argument only when taken to the extreme 'GenAI allows anyone to create art'. It's like saying 'Ikea allows anyone to be a woodworker'!
I love debating, but I want debate to learn things, not to walk people into traps.
Obviously the woodworker is a person. And you would be on a team that has woodworking as part of their skillset.
But the way you set up your reductio-ad-absurdum it can be read as implying the AI is a person too. O:-)
You know what, rather than just going for a flip rhetorical takedown, what if we took that implication seriously for a second?
What if you did mean to argue that (the) AI is a proto-person. Say you argue that they deserve to be in the credits as a (junior?) member of the team. That'd be wild! A really interesting framing, which I haven't heard before.
Or the weaker version: Use said framing pro-forma as a (practical?) legal fiction. We already have rules on (C) attribution. It might be a useful framing to untangle some of the spaghetti.
> If AI use produces obviously inferior work, how did it win in the first place?
they used some AI placeholders during development, as that can majorly speed up/unblock the dev loop without really having any ethical issues (you still hire artists to produce all the final assets), and in some corner cases they forgot to replace the placeholder
also some of the tooling they might have used might technically count as gen AI. E.g., way before LLMs became big I had dabbled a bit in gen AI, and there were some decent line-work smoothing algorithms and similar with none of the ethical questions. Tools which help remove some dumb annoying overhead for artists but don't replace "creative work" -- yet which are technically gen AI anyway...
I think this mainly shows that a blanket ban on "gen AI", instead of one on, say, "gen AI used in ways which replace artists", is kinda tone-deaf/unproductive.
> AI placeholders during development as it can majorly speed up/unblock
Zero-effort placeholders have existed for decades without GenAI, and were better at the job. The ideal placeholder gives an idea of what needs to go there, while also being obvious that it needs to be replaced. This [1] is an example of an ideal placeholder, and it was made without GenAI. It's bad, and that's good!
[1] https://www.reddit.com/r/totalwar/comments/1l9j2kz/new_amazi...
A GenAI placeholder fails at both halves of what a placeholder needs to do. There's no benefit for a placeholder to be good enough to fly under the radar unless you want it to be able to sneak through.
I've actually considered hiring artists to help me out a few times too under sort of comparable circumstances? I could use AI to generate basic assets, and then hire artists for the real work! More work for artists, better quality for me. Unfortunately, I fear I'd get yelled at (possibly as a traitor to both sides?)
Frankly, in the wider debate, I think engagement algorithms are partially to blame. Nuanced approaches don't get engagement, so on every topic everyone is split into two or more tribes yelling at each other. Folks in the middle who just want to get along have a hard time.
(Present company excepted of course. Dang is doing a fine job!)
Gamer social movements always burn bright at first, then die when they demand too much purity to reconcile with the fundamental truths: people want to make games and, when they're good, people want to play them. Trying to stop people from using (even experimenting with!) new tools is doomed, just like the old attempts to boycott games over their business models or their creators' politics/sexuality/whatever.
My objection to LLMs is the same that I had for TDD. There's all these people saying that you just gotta try it, but when I do, the effect is lesser than just using my preexisting skills. Oh, it's not for you? Wrong, here's some tautological or contradictory or poetic or nonsensical advice that'll be 'the wrong way' a week from now.
Do TDD and LLMs have a kernel of utility in them? Yeah, I don't see why not. But what the majority of people are saying doesn't seem to be true, and what the minority of people I can actually see using them 'for reals' are doing just isn't applicable to anything I care about.
With that in mind, the only thing less real to me than a tool that I have to vibe with at a social zeitgeist level to see benefits from is an award when I already have major financial and industrial success.
Half the people on my team have played the game. For months, all I would hear about w.r.t. games was how this game was smashing milestones and causing the entire industry to do some soul searching or put their fingers in their ears.
I'm sure they can console themselves from having lost this award with their piles of money.
[An LLM did help me with a cryptography api that was pretty confusing, but I still had to problem solve that one because it got a "to bytes" method wrong. So... once in a blue moon acceleration for things I'm unfamiliar with, maybe.]
Is anyone else detecting a phase shift in LLM criticism?
Of course you could always find opinion pieces, blogs and nerdy forum comments that disliked AI; but it appears to me that hate for AI gen content is now hitting mainstream contexts, normie contexts. Feels like my grandma may soon have an opinion on this.
No idea what the implications are or even if this is actually something that's happening, but I think it's fascinating
LLMs have had a couple of years by now to show their usefulness, and while hype can drive it for a while, it's now getting to the point where hype alone can't. It needs to provide a tangible result for people.
If that tangible result doesn't occur, then people will begin to criticize everything. Rightfully so.
I.e., the future of LLMs is now wobbly. That doesn't necessarily mean a phase shift in opinion, but wobbly is a prerequisite for a phase shift.
(Personal opinion at the moment: LLMs needs a couple of miracles in the same vein as the discovery/invention of transformers. Otherwise, they won't be able to break through the current fault-barrier which is too low at the moment for anything useful.)
No, AFAICT, AI hate has been common (but not the majority position, and still not) in normie contexts for a while.
You’re reading it wrong: rather, AI hype had been common (but not the majority position) in tech contexts for a while, especially from those that have something to sell you.
What you derogatorily call normies are the rest of the world, minding their own business until one day some tech wiz came around to say “hey, I have built a machine to replace all of you! Our next goal is to invent something even smarter under our control. Wouldn’t that be neat?” No wonder the average person isn’t really keen on this sort of development.
> You’re reading it wrong
No, I don't think I am.
> AI hype had been common (but not the majority position) in tech contexts for a while, especially from those that have something to sell you.
There's been a whole lot of that for quite a long time targeting normie contexts, too. In fact, the hate in normie contexts is directly responsive to it: the hype in normie contexts is a lot of particularly clumsy grifting plus the nontechnical PR of the big AI vendors (categories which overlap quite a bit, especially in Sam Altman's case). And the hate in normie contexts shows basically zero understanding of even what AI is, beyond what could be gleaned from that hype plus some critical pieces on broad (e.g., total water and energy use, RAM prices) and localized (e.g., fossil fuel power plants in poor neighborhoods directly tied to demand from data centers) economic and environmental impacts.
> What you derogatorily call normies
I am not using “normie” derogatorily, I am using it to contrast to tech contexts.
The most typical reactions I see outside of techie and arty spaces where people are most polarised about it are:
- annoyance at stupid AI features being pushed on them
- Playing around with them like a toy (especially image generation)
- Using them for work (usually writing tasks), with effectiveness ranging from pretty helpful to actively harmful depending on how much of a clue they have in the first place.
Discussion or angst about the morality of training or threats to jobs doesn't really enter much into it. I think this apathy is also reflected in how this has not seemingly affected the sales of this game at all in the months that it has been reported on in the video game press. I also think this is informed by how most people using them can fairly plainly see they aren't really a complete replacement for what they actually do.
They aren't using "normie" derogatorily; they just use it as a proxy for "non-tech people".
> “hey, I have built a machine to replace all of you! Our next goal is to invent something even smarter under our control. Wouldn’t that be neat?” No wonder the average person isn’t really keen on this sort of development.
Nope, most are just annoyed by AI slop bombarding them at every corner, news of AI scams claiming another poor grandma, and the AI tech industry making shit expensive. Most people's jobs are not under direct threat of replacement at the moment, unless you work in tech or art.
Job replacement and AI slop are both legitimate reason that people have negative opinions on AI
Amongst many other legitimate reasons.
It is fascinating. It's showing of course that AI has gone mainstream.
There was a time that I remember when you could gripe at a party about banner ads showing up on the internet and have a lot of blank stares. Or ask someone for their email address and get a quizzical look.
I pointed my dad to ChatGPT a few days ago and instructed him on how to upload/create an AI image. He was delighted to later show me his AI "American Gothic" version of a photo of him and his current wife. This was all new to him.
The pushback though I think is going to be short-lived in a way other push-backs were short-lived. (I remember the self-checkout kiosk in grocery stores were initially a hard sell as an example.)
How many American Gothic AI fake photos do you think he'll make. Sounds like a novelty experience to me. I also loved my first day in Apple's Vision Pro. It was mind blowing. On the 4th day I returned it. Novelty wears off, no matter how cool it might seem initially.
Oh, not disagreeing with you. A strange thing has happened in the past where what was novel also became the commonplace. Not in all cases, of course (and I personally also believe VR is one of those things that will never become commonplace).
It’s the usual “I don’t like it, I’m against it, but it’s okay if I use it” thing. People understand the advantage it gives one person over another, so they will still use it here and there. You’ll have some people who are vehemently against it, but it will be the same as people who are categorically against having smartphones, or who avoid using any Meta products because of tracking and so on.
People were told by other people to dislike LLMs and so they did, then told other people themselves.
Ha! You’re actually exactly right.
We’ve observed this in AI gen ads (or “creatives” as ad people call them)
They work really well, EXCEPT if there is a comment option next to the ad - if people see others calling the art “AI crap” the click rate drops drastically :)
I think that's a hint that people already dislike AI ads on principle but it's good enough now to fool them, and the comment section provides transparency.
If I was vegan and found out after the fact that a meal that I enjoyed contained animal products in it that doesn't mean I'm some hypocrite for consuming it at the time. Whether I enjoyed it or not at the time it still breaches some ethical standard I have, abstaining from it from then on would be the expected outcome.
yes, having some transparency is terrible to PR
Just like feminism when it was starting: back then, millions of women believed it was silly for them to vote, and those who believed otherwise had to get loud to win more people to their side. That's one example; similar things have happened with hundreds of other things we now take for granted, so its value as a measure of judgment is very low by itself.
That’s a bad faith argument using weasel words. Do not assume everyone who disagrees with you is an unthinking tool.
https://xkcd.com/610/
Look at how easy it is to make the argument in the other direction:
> People were told by large companies to like LLMs and so they did, then told other people themselves.
Those add nothing to the discussion. Treat others like human beings. Every other person on the planet has an inner life as rich as yours and the same ability to think for themselves (and inability to perceive their own bias) that you do.
Just as they were told to like them in the first place. A lot of this is driven that way because most of the public only has a surface-level understanding of the issues.
It's because the amount of AI slop bombarding people from every side increased and created a knee-jerk reaction to anything AI, even when it's actually the "remove the boring part of work" kind.
The issue with "removing the boring part of work" is that which part of the work is "boring" is subjective. There are going to be plenty of people that don't think that what they do is the "boring stuff that should be automated away." Whether this is genuine enjoyment for what they do or just an attempt to protect their career, both are valid feelings to have.
all news is prophesying that everyone is going to lose their jobs to "AI"
along with news about "AI" causing electricity bills to rise
every form of media is overrun and infested with poor quality slop
garbage products (microsoft copilot) forced on them and told by their bosses to use it, or else
gee I wonder why normal people hate it
The art bubble is generally considered more "normie" than the tech bubble and they've been strongly anti AI art for longer than even the introduction of the original GitHub copilot
It feels like a similar trend to the one that NFTs followed: huge initial hype, stoked up by tech bros and swallowed by a general public lacking a deep understanding, tempered over time as that public learns more of the problematic aspects that detractors publicise.
I don't feel NFTs ever really had much interest among the general public - average reaction just being "I don't get it, that sounds pointless".
Whereas AI seemed to have a pretty good run for around a decade, with lots of positive press around breakthroughs and genuine interest if you showed someone AI Dungeon, DALL-E 2, etc. before it split into polarized topic.
I think this comparison makes little sense, as in the case of AI there is some actual impactful substance backing the hype.
NFTs had far fewer downsides than LLMs and GenAI, since the main downside was just wasted electricity. I didn't have to worry about someone cloning my voice and begging my mom on the phone for money.
If you look at daytime TV in the UK, there are a lot of ads targeting the elderly talking about funeral cover and life assurance and so on.
I for one cannot wait for a future where grandparents get targeted ads showing their grandchildren, urging them to buy some product or service so their loved ones have something to remember them by...
Read the other comments in the thread, lol:
“Fuck artists, we will replace them”
This is not a winning PR move when most normal people are already pretty pro-artist and anti tech bro
Typical brigading, same with blm, woke, right wing, etc.
Wow you do mentally group things efficiently, that much I can say.
You need to separate AI usage from automating certain parts of a pipeline from end to end creation.
Taking a scorched earth approach to AI usage is just being a luddite.
I don't think this argument is going to be very compelling. The people whom you would be trying to convince here, would just argue that the Luddites were correct in their fight against human labor being displaced. They'll argue that the power artisans had over their work was diminished with the advent of the loom, just like the power artists have over their labor is being diminished right now.
That's a very easy problem to solve: take away the things they have that were created thanks to that progress, and they will change their minds quickly.
I don't disagree with your point, but regardless of how you or I feel about it, this flap will likely seem quaint a decade from now. It's the unstoppable way the world is moving.
I'd love for them to create a separate category for "Best non-AI game". They can fight it out over that award. Perhaps then in a decade or so they will quietly let the award category fade away.
No you are just not understanding the difference between changing and optimising workflow from displacing creativity and artists.
You don't really need to win an argument with luddites. Completely rejecting extremely useful technology and then picking a fight with people who don't is a way to speedrun losing, whether you have "compelling arguments" or not. If the Luddites were correct, they wouldn't be dead.
So living or dying proves right and wrong? So, if the Polish were right the Germans wouldn't have massacred millions of them? What a garbage position.
What's wrong with being a Luddite?
If a fraction of the AI money would go into innovative digital content creation tools and workflows I'm not sure AI would be all that useful to artists. Just look at all those Siggraph papers throughout the years that are filled with good ideas but lacked the funding and expertise to put a really good ui on top.
This is crazy. Tools like Photoshop have gen AI tools in them. Does that mean Photoshop is now a minefield for artists? If a single artist uses the wrong tool once, does that disqualify the entire final product for awards, even if the asset is fully removed from the final build?
Ultimately this move might have just been to increase visibility for an otherwise niche awards show (which it has clearly done). Also, by eliminating the obvious best indie game of the year, it opens up the field a bit to more "normal" contenders. Expedition 33 is basically a AAA-quality game; it's only considered "indie" because a small, unknown team made it.
IDEs now have “AI” autocomplete; will a game become ineligible if a single dev accidentally presses tab instead of writing the whole function by hand? If a script writer uses ChatGPT to generate ideas, straight up ban?
Where does the organisation intend to draw the line?
Better blacklist Google as well. You don't want anyone on the team searching anything on Google lest their search accidentally triggers the LLM response (meaning: they prompted Google Gemini).
They don't. Because selling hate does not draw lines.
> Where does the organisation intend to draw the line?
The answer to this question is always "somewhere". Just because I can't proclaim an exact number of trees that constitute a forest doesn't mean the concept doesn't exist.
> doesn't mean the concept doesn't exist.
No, but it becomes a dubious concept when you define forests as collections of only conifer trees and say that deciduous trees don't count toward the definition of a forest.
Sorry for the... clicko? Thanks for posting the right one.
man, a reddit clone that reads exactly like reddit. what's the point, even?
> what's the point, even?
Learning. From the website:
> I started it on January 13, 2023, to learn something new and improve my GNU/Linux skills.
https://bloat.cat/about/
Also, not relying on a single service for one thing is a good thing, as Reddit itself demonstrated when they closed off API access.
I didn't mean the layout, but the comments - the same basic bitch "capitalism bad" that is perfectly acceptable to chant ad nauseam on reddit proper.
That's because it's just a Reddit frontend.
As an indie developer, I take much more issue with E33 falling under the “Indie” category than them using AI.
This is like disqualifying Banksy because they use stencils
To be consistent, if you wish to protect workers by rejecting artificially produced assets, you should feel the same about textiles produced by industrial machinery. Either this decision was wrong or the Luddites had a good point.
Sure, but for the body of folks offering a gaming award, there is little power they have over the textile industry.
To others you may be addressing, I suspect they would say the ship has already sailed on textiles. Perhaps they are trying to sink this ship before it sails.
If the product is not made from material dug out from ground or plants or animals by only bare hands. And I mean bare hands. Is it even worth buying?
If the worker isn’t suffering because of useless manual work, I don’t want to buy it.
Machines? Bah, humbug!
/s
When it comes to AI, I'm more of a luddite at the moment; things change like every 6 months when it comes to prompting the models.
But I don't mind people using AI, it's their own choice; the focus then just becomes the curation skill of the individual, team, company, etc. over the generated AI output. So taking away the award is kind of weak, given people enjoyed the game.
> When it comes to AI, I'm more of a luddite at the moment; things change like every 6 months when it comes to prompting the models. [...] So taking away the award is kind of weak, given people enjoyed the game.
To nitpick: the independent game awards are the Luddites here. The Luddites were a protest movement, not just a group of people unfamiliar with technology.
In the historical context that's apparently become appropriate again, Luddites violently protested the disruptive introduction of new automation in the textile industry that they argued led to reduced wages, precarious employment, and de-skilling.
> "Generating placeholder assets is completely acceptable, etc."
Not if it's against the rule. They got caught with skidmarks. And while the "Ackshually, those skidmarks are just placeholders"-defense may elicit a few cheap laughs, it doesn't matter if you follow the rule to its logical conclusion. Any possible deception in such cases comes on top of it. As it always has; that doesn't change just because you found a new plaything (LLMs) in the box.
This isn't even an indie game, with funding rivaling major studios, what are we doing?
"While the assets in question were patched out, it still goes against the regulations we have in place. As a result, the IGAs nomination committee has agreed to officially retract both the Debut Game and Game of the Year awards"
Looks like "regulations nitpicking". In the end it doesn't represent the players best interests.
I just think it's fun to watch "artists" seethe.
>Of course I think we should study art! Why yes of course I studied the greats to hone my skills, sometimes even copying their work directly to strengthen a specific skill set!
>Ai "studies" prior works to hone it's skillset...
>No! Not like that!
To help prevent confusion: Clair Obscur was not stripped of its record-breaking 9 awards at the Game Awards.
The Indie Game Awards, despite sounding similar to The Game Awards, is an unrelated organization that holds their awards the same week. They are small and this is their second year.
As an indie game developer the idiots who made this decision do not represent us and are completely detached from actual game production over the last 3 years.
For those who might care, we use generative AI as much as possible in every way possible without compromising our vision, this includes sound, art, animation, and programming. These are often edited or entirely redone (effectively placeholders). It's part of the process, similar to using procedural art generation tools like geometry nodes in Blender or fluid sim particles generators.
And btw, both UE5 and Unity now have gen AI features (and addons) that all developers can and will use.
I wish they had CRUD application awards.
Rather than going into a huge rant about this, let me just give a quick anecdote.
It used to be that there were tons of websites, like textures.com, which curated huge databases of textures, usable by art professionals and hobbyists alike. Some of it was free, some you had to pay for, but generally speaking it wasn't too expensive, and if you picked up 3D modeling as a hobby, you could produce pretty decent results without spending a dime.
Then came the huge companies (you know which ones) which slurped up all these websites, and turned them into these SaaS monstrosities, with f2p mechanics. Textures were no longer free, but you had to pay in 'tokens' which you got from a subscription, which pushed you into opaque pricing models, bundling subscriptions, accidental yearly signups with cancellation fees, you know the drill.
Then came AI, which is somehow fair use, and instead of having to pay for that stuff, you could ask SD to generate a tiling rock texture for you.
Is this blatant copyrightwashing? I'd argue yes. But in this case, does copyright uphold any morally supportable principle, or does it help artists get paid?
F no.
People were against steam engines, tractors, CGI, self-checkouts, and now generative AI. After some initial outrage, it will be tightly integrated into society. Like how LLMs are already widely used to assist in coding.
Or not. Unlike all of the above, AI directly conflicts with the concept of intellectual property, which is backed by a much larger and more influential field.
I wonder what definition of AI they're using? If you go by the definition in some textbooks (e.g., the definition given in the widely used Russell and Norvig text), basically any code with branches in it counts as AI, and thus nearly any game with any procedurally generated content would run afoul of this AI art rule.
Their FAQ only states:
> Games developed using generative AI are strictly ineligible for nomination.
I haven't found anything more detailed than that; I'm not sure if anything more detailed actually exists, or needs to.
That's all I've found as well, but, personally, I find that a bit unclear, for a couple of reasons. First, are they saying that the game itself can use generative AI, but it can't be used in the development of the game? So that would mean that if the game itself generates random levels using a generative AI approach, that's allowed, but, if I were to use that same code to pre-generate and manually modify the levels, that wouldn't be allowed because I'm now using generative AI as part of the development process? I.e., I can create a game that itself is a generative AI, but I can't use that AI I've built as part of the development of a downstream game?
And, second, what counts as generative AI? A lot of people wouldn't include procedural generative techniques in that definition, but, AFAIK, there's no consensus on whether traditional procedural approaches should be described as "generative AI".
And a third thing is, if I use an IDE that has generative AI, even for something as simple as code completion, does that run afoul of the rule? So, if I used Visual Studio with its default IntelliCode settings, that's not allowed because it has a generative AI-based autocomplete?
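On the second point, here's the kind of traditional procedural generation most people would hesitate to call "generative AI", even though it generates content: a tiny value-noise texture generator (a hypothetical sketch, not anything from the game under discussion):

```python
import math

def hash2(x: int, y: int, seed: int = 0) -> float:
    """Deterministic pseudo-random value in [0, 1) from grid coordinates."""
    h = (x * 374761393 + y * 668265263 + seed * 2147483647) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 2**32

def value_noise(x: float, y: float, seed: int = 0) -> float:
    """Bilinearly interpolated value noise, a classic procedural technique."""
    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0
    # Smoothstep the fractions so the result is less blocky
    sx, sy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)
    top = hash2(x0, y0, seed) * (1 - sx) + hash2(x0 + 1, y0, seed) * sx
    bot = hash2(x0, y0 + 1, seed) * (1 - sx) + hash2(x0 + 1, y0 + 1, seed) * sx
    return top * (1 - sy) + bot * sy

# An 8x8 grayscale "texture": the same seed always yields the same pixels.
texture = [[value_noise(i / 4, j / 4) for j in range(8)] for i in range(8)]
```

There's no learned model anywhere in there, just arithmetic on a seed, which is exactly why lumping it under the same label as a diffusion model feels off to a lot of people.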
> AFAIK, there's no consensus on whether traditional procedural approaches should be described as "generative AI"
Sure there is. "Generative AI" is just a marketing label applied to LLMs - intended specifically to muddy these particular waters, I might add.
No one is legitimately confused about the difference between hand-built procedural generation techniques, and LLMs.
That's not quite true though, right? Because diffusion models are also generative AI and they're not LLMs. Heck, they probably got disqualified, not for the use of an LLM, but for the use of a diffusion model.
So I think Gen AI is an umbrella. The question is, do older techniques like GANs fall under Gen AI? It's technically a generative technique that can upscale images, so it's generating those extra pixels, but I don't know if it counts.
There's not that much difference between diffusion models and other auto-regressive models (https://www.youtube.com/watch?v=zc5NTeJbk-k). But I'm of the opinion that "generative AI" is a terrible umbrella term. It would include basically all of digital art if we took it seriously: the flood fill / paint bucket tool can be considered AI, and any program using a search algorithm can be phrased in AI terms as a sense-think-act loop. Nevertheless, I do understand what people tend to mean by it when they're raging. Right now it might best be defined in terms of workflow: a human uses natural language to describe what they want, and moments later a plausible image appears trying to match. This clearly separates it from every other tool in the digital artist's toolbox, even from the many one could arguably call generative AI. It also separates it from stock-photo/texture searches done externally to some art program, as those are done in a query language rather than natural language.
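For what it's worth, the flood fill / paint bucket tool mentioned above really is just a deterministic graph search; a minimal sketch (hypothetical names, not from any shipping art program) shows how stretchy the "AI" label gets if taken literally:

```python
from collections import deque

def flood_fill(image, start, new_color):
    """Classic paint-bucket fill: BFS over same-colored 4-neighbors."""
    rows, cols = len(image), len(image[0])
    r0, c0 = start
    old_color = image[r0][c0]
    if old_color == new_color:
        return image  # nothing to do
    queue = deque([start])
    image[r0][c0] = new_color
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and image[nr][nc] == old_color:
                image[nr][nc] = new_color
                queue.append((nr, nc))
    return image

grid = [
    [0, 0, 1],
    [0, 1, 1],
    [1, 1, 0],
]
flood_fill(grid, (0, 0), 2)  # fills only the connected top-left region of 0s
```

Same input, same output, every time; there is no model and no training data, which is exactly why most people wouldn't group it with diffusion models under one term.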
AI is a moving goalpost. At least now the moving goalpost is called AGI.
A bunch of 'if' is an "expert system", but I'm old enough to remember when that was groundbreaking AI.
Oh, it's the AI that makes it not indie, not the huge funding.
Should indie games have a maximum budget?
Yes. If you get 120M+ in funding, you no longer qualify to be called indie.
So what would the limit be in your mind? Does it include marketing?
Regardless of the definition of the word, I don't think anybody would call a game indie that has tens of millions in production budget, over 300 developers working on it, and a movie deal before it was even released.
So under ten million max then?
What's the maximum developer count? Do outsourced assets count, if so, how? By the amount of people who directly worked on the assets by the outsource company or the whole headcount?
All press is good press.
Few care about the mainstream game review sites or oddball game award shows as their track record is terrible (Concord reviews).
Most go by player reviews, word of mouth, and social media.
Great opportunity for a new award body that allows AI use.
True, especially for indie game awards, which have the least resources available and would most likely benefit the most from some use of AI. At that scale, even reasonably paid game developers are expensive.
I hear FIFA makes new awards these days
Just to be clear, it's some Indie Game awards, not the main The Game Awards
That’s not how awards work. Awards trade on prestige. In order for an award to matter, the people you’re giving it to have to care.
I think you’ll find most of the small teams making popular indie video games aren’t going to be interested in winning a pro-AI award.
> In order for an award to matter, the people you’re giving it to have to care.
Are you sure? Maybe not in gaming, but I'm sure most large companies create awards just to get them and mention them in marketing.
I wouldn't be surprised if the likes of EA and Ubisoft create a "best use of AI in gaming" award for next year.
Deserved. It wasn't even indie either.
Why is usage of AI even a discussion point? Steam now requires publishers to disclose if they used AI during game creation. It is a tool, and as a consumer I judge the end product. I don't care what tools were used in production, just as I don't care if you use Photoshop, Pixelmator, Maya, 3DS Max, or whatnot. The end result is what counts. And if the end result is full of bullshit AI slop and is not fun to play, don't give them an award. I played Clair Obscur and it is an absolutely stunning and beautiful game.
These things will keep happening and the bar to be against certain use cases of AI will shift gradually over time.
Before we know it we will have entrusted a lot to AI and that can be both a good or a bad thing. The acceleration of development will be amazing. We could be well on our way to expand into the universe.
LOL this is beyond idiotic. Banning AI-generated assets from being used in the game is a red line we could at least debate.
But banning using AI at all while developing the game is... obviously insane on its face. It's literally equivalent to saying "you may not use Photoshop while developing your game" or "you may not use VS Code or Zed or Cursor or Windsurf or Jetbrains while developing your game" or "you may not have a smartphone while developing your game".
> you may not use VS Code or Zed or Cursor or Windsurf or Jetbrains while developing your game
Just to nitpick, AAA game developers probably don't use the editors you mentioned since they do native applications.
What do you think they use?
I work in mobile games and practically everyone is using either VSCode or some Jetbrains IDE. A few use Visual Studio but it has AI autocomplete too.
Oh, mobile. I was thinking of the AAA desktop titles. I'm too poor to play free to play games so the mobile titles are kinda out of my mind.
Of course you'd use Jetbrains for Android...
Unity is by far the biggest platform for mobile games and its C# all the way, both for Android and iOS.
You need to crack open Xcode only for very specific debugging tasks.
Interesting. I don't do games, so I may be wrong, but I thought a lot of Unreal Engine devs used Jetbrains. So what editors do they use? Are there current IDEs or code editors shipping in 2025 that don't have any LLM-based coding features?
I could be wrong, but I'm not aware of any native debugging support except in visual studio... the native one.
For C# Rider has had superior tooling for debugging for years now.
OK, maybe my point got lost because I didn't know that, but I should have just added Visual Studio to my list — it too has LLM and agentic features, which was my point.
If you can't use LLMs to generate placeholder graphics that don't ship in the actual game, then why can you use coding editors that let you use LLMs to generate code?
If LLMs were simply a niche but somewhat useful technology people could choose to use or avoid, then sure, such an absolutist stance seems excessive. But this technology is being aggressively pushed into every aspect of our lives and integrated into society so deeply that it can't be avoided, and companies are pushing AI-first and AI-only strategies with the express goal of undermining and replacing artists (and eventually programmers) with low quality generic imitations of their work with models trained on stolen data.
To give even an inch under these circumstances seems like suicide. Every use of LLMs, however minor, is a concession to our destruction. It gives them money, it gives them power, it normalizes them and their influence.
I find the technology fascinating. I can think of numerous use cases I'd like to explore. It is useful and it can provide value. Unfortunately it's been deployed and weaponized against us in a way that makes it unacceptable under any circumstances. The tech bros and oligarchs have poisoned the well.
I mean, I share some of the concerns you expressed, but at the same time there is no chance at all that working programmers and artists won't be using LLMs (and whatever "AI" comes next).
I'm a programmer, and I enjoyed the sort of "craftsman" aspect of writing code, from the 1990s until... maybe last year. But it's over. Writing code manually is already the exception, not the rule. I am not an artist, and I also really do understand that artists have a more legitimate grievance (about stealing prior art) than we programmers do.
As a practical matter, though, that's irrelevant. I suspect being an "artist" working in games, movies, ads, etc will become much like coding already is: you produce some great work manually, as an example, and then tell the bots "Now do it like this ... all 100 of you."
> To give even an inch under these circumstances seems like suicide.
Seems like a histrionic take.
It’s like banning any and all uses of chainsaws for any kind of work ever just because some bros juggle with them and have chainsaw juggling conventions.
It’s just a tool, but like any tool it can be used the right way or wrong way. We as a society are still learning which is which.
More like banning automated forest clearing power chainsaw robots because they mowed down all the forests killing most life on earth. That's a terrible analogy, but better than yours.
indie game? with their budget and staff? really?
those guys worked in AAA studios and they got a $10 million budget
how "indie" is that?
Indie Game Awards defines it like this:
"Existing outside of the traditional publisher system, a game crafted and released by developers who are not owned or financially controlled by a major AAA/AA publisher or corporation, allowing them to create in an unrestricted environment and fully swing for the fences in realizing their vision."
In other words, "indie" means a developer-driven game independent of the establishment. It doesn't necessarily imply a low budget or the lack of professional experience.
"indie" is just a marketing term now, it doesn't actually mean anything specific.
Ironically, at some point it will just mean anti-indie.
We should probably strip the Indie Game Awards of their awards show because the accountant used ChatGPT.
If they want to ban AI from their show that is their prerogative, but considering that every nominee probably used AI somewhere (I'd bet money on this), this feels like blatantly dishonest posturing.
This is a travesty. Where Winds Meet uses LLMs, yet everyone seems to love that. Wtf.
More and more AAA games are going to have AI. Whether it’s AI content, AI dialogue, AI driven storytelling, or AI driven animation.
Having game of the year title stripped over some texture use is some next level petty BS.
This is so ridiculous that I suspect that it will be even better publicity for them than the award itself.
It's some random Indie award, not the main The Game Awards. Clair Obscur has enough publicity already and rightly so.
Dunno if they even care too much about that, the game is already a breakaway success.
It’s interesting, because we have examples of other sects in the past that also opposed human progress through technology. History is repeating itself.
For instance, see Luddites: https://en.wikipedia.org/wiki/Luddite
That does the Luddites a bit of a disservice:
> But the Luddites themselves “were totally fine with machines,” says Kevin Binfield, editor of the 2004 collection Writings of the Luddites. They confined their attacks to manufacturers who used machines in what they called “a fraudulent and deceitful manner” to get around standard labor practices. “They just wanted machines that made high-quality goods,” says Binfield, “and they wanted these machines to be run by workers who had gone through an apprenticeship and got paid decent wages. Those were their only concerns.”[1]
[1] https://www.smithsonianmag.com/history/what-the-luddites-rea...
In that case, the neo-Luddites are worse than the original Luddites, then? Since many are definitely not "totally fine with the machines", and definitely do not confine their attacks only on the manufacturers that go against worker rights, but they include the average person in their attacks. And the original Luddites already got a lot of hate for attempting to hold back progress.
I don't know about worse, but I think the situations are very similar. It's inaccurate to think the Luddites just hated technological advancement for the sake of it. They were happy to use machines; why wouldn't they be, if they had a back-breaking and monotonous job and the machine made it easier?
The issue is not the technology per se, it's how it's applied. If it eliminates vast swathes of jobs and drives wages down for those left, then people start to have a problem with it. That was true in the time of the Luddites and it's true today with AI.
I really like Neal Stephenson's neologism 'amistics' - referring to which technologies a culture knows about but chooses not to use.
It's unclear if Gen AI promotes any sort of human progress.
By all means, I use it. In some instances it is useful. I think it is mostly a technology that causes damages to humanity though. I just don't really care about it.
I think it's more the fact that they lied before nomination than the AI usage itself. Any institution is bound to disqualify a candidate if it discovers it was admitted on false grounds.
I wonder if the game directors had actually made their case beforehand, they would have perhaps been let to keep the award.
That said, the AI restriction itself is hilarious. Almost all games currently being made would have programmers using copilot, would they all be disqualified for it? Where does this arbitrary line start from?
> Almost all games currently being made would have programmers using copilot
I think that is almost certainly untrue, especially among indie game developers, who are often the most stringent critics of gen AI.
This is just one example, but today I found this where two people build their passion project using GenAI for image generation (+ photoshop), maybe otherwise this project wouldn't even be possible: https://reddit.com/comments/1prqfsu
Are you sure? A survey by the YouTuber Games And AI found that the vast majority of indie game developers are either using, or considering using AI. Like around 90%.
Only when it comes to graphics/art. When it comes to LLMs for code, many people do some amazing mental gymnastics to make it seem like the two are totally different, and one is good while the other is bad.
> Almost all games currently being made would have programmers using copilot
Which LLM told you that?
Please, LLM code assistants are ubiquitous enough nowadays with inline code suggestions in vscode on by default. It's an extremely safe claim.
That would imply the following to be true,
> Almost all games currently being made would have programmers using VSCode.
Which clearly isn't the case, unless they like to suffer in regards to the Unreal and Unity integrations.
> That said, the AI restriction itself is hilarious. Almost all games currently being made would have programmers using copilot, would they all be disqualified for it? Where does this arbitrary line start from?
AI OK: Code
AI Bad: Art, Music.
It's a double standard because people don't think of code as creative. They still think of us as monkeys banging on keyboards.
Fuck 'em. We can replace artists.
It is silly, considering there is obviously a much higher chance that a code-generating LLM produces a copy of existing copyrighted code than an image-generating diffusion model produces a copy of an existing copyrighted image.
You get why people hate AI when AI boosters talk like this, right?
> It's a double standard because people don't think of code as creative.
It's more like the code is the scaffolding and support, the art and experience is the core product. When you're watching a play you don't generally give a thought to the technical expertise that went into building the stage and the hall and its logistics, you are only there to appreciate the performance itself - even if said performance would have been impossible to deliver without the aforementioned factors.
I would disagree, code is as much the product in games as the assets.
Games always bear their engine's touch, and for indie games in particular that's a good part of the process. See for example Clair Obscur here, which clearly has the UE5 character hair. The engine determines what the game can and cannot do, and it shapes the experience.
Then the gameplay itself depends a lot on how the code was made, and iterations on the code also shape the gameplay.
To further this: You can even feel the org structure in games.
- Final Fantasy 7 Rebirth clearly had two completely decoupled teams working on the main game and the open world design respectively
- Cyberpunk 2077 is filled with small shoeboxes of interactable content
Well that’s a rule that makes no sense.
These awards are behind the times and risk irrelevance.
What software in 2025 is written without AI help?
Every game released recently would have AI help.
> Every game released recently would have AI help.
For indie games in particular, that is very much not true. In fact, Steam has a 'made with AI' label, so it's not even true on that platform.
You think many are built without any assistance for coding? My impression was that people were mostly concerned about game assets like graphics and music
I think many are built without the use of gen ai to create assets. Obviously, the term "AI" is flexible enough that you could clarify every piece of software as involving AI if you wanted to, but I don't think that's productive.
I would assume that if a tool is there and the alternative is too costly, they would use the tool instead of burying their project. Just today I stumbled over this, for example, where they use GenAI as well: https://reddit.com/comments/1prqfsu
Do you have proof that many are using AI for coding?
Not for coding, but today I stumbled upon these two building their passion project using GenAI, which would otherwise perhaps not be possible: https://reddit.com/comments/1prqfsu
If you join a contest with a rule that doesn't make sense, lie about following the rule when you did not, and the contest disqualifies you, it's not the rule that's the problem, it's the lying about complying with the rule when you did not. Are you really so confused by this whole thing that you don't get that, or are you just another Big Tech AI fanbro?
Excellent news.
A blanket ban is the way to go on this, people trying to muddy the waters professing they just have nuanced opinions know what they are doing... it's only a horse armour pack, it doesn't affect gameplay, you don't have to use it, you won't notice if it's not there...
After the huge impact on the PC gaming community, it's logical to despise AI and ban it from any awards. First cryptocurrencies drove huge price rises on GPUs, then prices didn't return to normal due to AI, and now it's impacting RAM prices.
Next year a lot of families will struggle to buy a needed computer for their kids' school because some multibillion-dollar tech companies went all-in.
I play games on cheap hardware. I would like awards to focus on the quality of the game, rather than how they were made.
Awards that focus on quality are too desired not to exist.
I expect generative AI to become a competitive advantage taken up by the vast majority.
Compute demand likely improved today's availability and performance of hardware.
This is a bit ironic in a way that video games in particular have been using "AI" since the beginning.
Correct decision. Skidmarks were detected to be present and it wasn't sabotage or somesuch. That means no award as skidmarks are considered so unprofessional at that particular award that, at this point, there's a rule against them.