marcellus23
3 months ago
> And how do you stop people from making fetish content of purely AI-generated characters that aren't cameos of real people? Does OpenAI want to stop that? Maybe OpenAI thinks it's fine for people to make belly-flation or foot-fetish videos as long as they're not of a real person.
I can't figure out the tone here. Is the author suggesting we should stop people from creating fetish content of purely AI-generated characters? OpenAI might want to for business reasons, but surely there's nothing inherently wrong with using AI for fetish content. Should we also stop people from drawing fetish content with pencil and paper?
smcin
3 months ago
A good-faith reading of the sort of suggestion the author never made is for the blanket opt-in consent to allow other users to generate images/videos of you to be segmented into separate consents for PG, adult, fetish, etc.; also face vs. whole body. A very clear consent form that tells users upfront: "If you consent to users generating fetish content of you, here are some examples of what's allowed and forbidden."
> Is the author suggesting we should stop people from creating fetish content of purely AI-generated characters?
Presumably not, but she's farming outrage rather than suggesting any fix. In the above suggested setup, people could then generate fetish content only from the much smaller set of users who consented to have fetish content generated from them. But then of course those users might expect some royalties or revenue-sharing, or at least identification/attribution/watermarking so the depicted user could drive traffic to their social media. OpenAI is skirting around not just segmented consent but any concept of revenue-sharing (i.e. OpenAI wants to dip its toe into OnlyFans territory, but without any revenue-sharing or licensing deal with creators).
aleph_minus_one
3 months ago
> OpenAI is skirting around not just segmented consent but any concept of revenue-sharing (i.e. OpenAI wants to dip its toe into OnlyFans territory, but without any revenue-sharing or licensing deal with creators).
OpenAI is still doing basic experiments to learn which product offerings are well received by users and/or work well, and which are not. If some data provided by users (e.g. photos depicting the user) turns out to be so essential to the success of the AI-created content that OpenAI would likely lose an insane amount of money if those users leave (I think this is rather unlikely, but not impossible), then OpenAI will think about some concept of revenue-sharing, but not before (why should they?).
mslt
3 months ago
Pointing out that something feels a little creepy, while explicitly stating that you don’t yuck other people’s yum, is hardly “farming outrage.” We collectively need to have earnest conversations about how emerging technologies affect our experiences, and her tone is pretty middle-of-the-road.
smcin
3 months ago
Intentionally opting into an AI platform where we could already foresee that a subset of users would start generating sexual content of other users without their knowledge, then professing to be surprised about it, seems contrived. OpenAI had just allowed porn and violence, and we already know a subset of their users will obsessively push the guardrails (plus there's a financial incentive to).
The only parts of the article that seem to be news are a) OpenAI's blanket consent is very broad and doesn't warn users what might be done with cameos, or segment consent into various different types of content use (as even 25yo modeling sites do) b) that subset of users will bypass the guardrails and c) OpenAI doesn't close the feedback loop by notifying the users in a) what the users in set b) are doing, let alone allow revising or revoking consent.
But why is the conversation only about b) (the predictable bad behavior by users) rather than a) and c) (feasible solutions)?
Notopoulos correctly remarks: "part of an overall pattern of how OpenAI has approached the concept of copyright and intellectual property: asking forgiveness, not permission."
By the way, previous (non-sexual-content) incarnations of this sort of issue include 2019, when Clearview scraped 3 billion images non-consensually from people's social media and state DMVs [https://news.ycombinator.com/item?id=35421117].
Or the previous 2024 OpenAI/Scarlett Johansson-sounding voice shenanigans. Or the existing proliferation of AI porn elsewhere.
quantified
3 months ago
How does other people doing things affect her, though?
TiredOfLife
3 months ago
> but surely there's nothing inherently wrong with using AI for fetish content.
It's taking work from OnlyFans, artists who draw fetish content on commission (usually by copying other people's art styles), fanfic writers (who copy the writing style, characters, and settings of other people), and other organic, free-range fetish content producers.
marcellus23
3 months ago
I see that position, but also, almost everyone (99% of people?) generating fetish content with AI would not otherwise have been paying artists to commission that content.
casey2
3 months ago
Many girls believe that there is both inherent harm and external harm in the creation and consumption of sexual content. Arguing that AI content is inherently harmful is tricky, but that argument can be swapped out for yet more external harms.
whitexn--g28h
3 months ago
Many religious extremists, gender unrelated.
fwip
3 months ago
I don't think she's passing any judgment here - she pointed out earlier in the article that she knows people are into weird stuff, and didn't want to "yuck their yum."
Jackson__
3 months ago
I don't think one gets to call these people 'perverts' in the title, then claim not to 'yuck their yum'. The yum was yucked before the article even started.
regularfry
3 months ago
Titles frequently aren't in the control of the article's writer. I don't know whether that's the case here or not.
ivape
3 months ago
The problem is that adults contribute to turning public platforms lewd. One lewd person on Instagram leads to many, which leads to a lewd platform. This becomes problematic when children pick it up. It’s not really that different from prostitution appearing near general-purpose places: it turns them into a red-light district.
A lot of social media is a sex platform, and it got mixed up in this way because there’s no talking adults out of being lewd in public.
tremon
3 months ago
Present-day adults were raised in a society where overt sexualization has been the norm for decades. I see the normalization of lewdness as a logical consequence of 60 years of sexualization in media, and the growing resistance against it as the normal swinging of the pendulum.
frumplestlatz
3 months ago
I wish we could talk adults out of it, but many decades on this earth have convinced me that’s just not going to happen.
mmooss
3 months ago
It's not easy or certain, but behaviors do change, sometimes dramatically. Youngish people have significantly less sex than before. People smoke a lot less in some countries, as one simple example.
Social media is still immature. We'll develop norms around what is appropriate.