Korean women remove pictures, videos from social media amid deepfake porn crisis

47 points, posted 14 hours ago
by popcalc

107 Comments

BadHumans

13 hours ago

People on HN have a hard time understanding that the barrier to entry matters. Photoshop has existed for decades, but the level of skill needed to make realistic-looking nudes in Photoshop is not something your average freak with a fetish possesses. New technology puts that power into the hands of every single person.

adventured

13 hours ago

It's not even your average freak with a fetish that is the primary issue. The volume will come from the average anybody (although mostly men). That's the real problem. Random co-workers in the office who want to deepfake their crush. Your average teenage boy with a decent GPU who wants to deepfake his classmates just to experiment. There have already been several instances of this making the news in Europe, for example, so it's obviously going on globally.

Oh, but the average guy isn't that perverted? Yes, they are. We know this from the vast amount of porn they consume and the type of porn they consume.

s1artibartfast

13 hours ago

Seems like revenge porn and sexual harassment laws should already cover the distribution use case.

What perverted things someone does with photos in the privacy of their own home seems like it should be their own business to me.

ultimafan

12 hours ago

Actually nailing someone for revenge porn or sexual harassment seems like a borderline impossible task for police departments if the person doing it has even a minor degree of anonymity. Lowering both the skill and the time it takes to make convincing fakes enlarges the potential pool of people who can do this, and the pool of victims they can target, by an order of magnitude; that doesn't seem like a good idea to me.

techjamie

7 hours ago

> New technology puts that power into the hands of every single person.

I think the bar is roughly the same, all things considered. If you're going to use Photoshop, it's largely a matter of combining two images with enough masking and blurring to make the result look good enough.

With image generators, you've got a couple of different options. The more trivial is doing img2img on your target; the more complex is training a LoRA.

Knowing how tech-savvy the average Andy is not, I would say anyone capable of going through the steps of the AI method could, with some brute-forcing, also manage the Photoshop method. Setting up a local instance of any image generator is beyond the majority of the population, so the closest most people can get is a paid online GUI offering an existing service, which they first have to become moderately proficient with to get good results.

The only real difference is that anyone who can do it is much more capable of scaling over time than before. With enough Comfy nodes, you could probably even automate it, given face detection and arbitrary images.

piva00

4 hours ago

The bar is definitely not the same. Editing a photo in PS or another photo editor into a collage realistic enough to cross the uncanny valley is not simply combining two images and doing some blurring and adjusting, even less so if the pictures are warped by different perspectives, lenses, etc.

Doing that with an image generator is trivial; there are whole websites whose sole purpose is taking a seed image and transforming it through a prompt. That's a much lower barrier to entry for someone trying to make a fake nude. And it scales: one can do this in a batch in a few minutes or hours, whereas doing it manually in a photo editor requires a lot more effort and knowledge.

SmitheryB

13 hours ago

I feel as though I'm the only one who doesn't care about this.

Curious to know: 20-30 years ago, as graphic design tools got better, were people freaking out about being able to superimpose another person's face on the body of a p*rnstar?

Superimposing people's faces onto porn stars' bodies happened (and still happens), but does anyone really notice or care about it nowadays? For the most part, of course not.

So why should we care about porn videos? What makes them so different? The whole concern smells like typical media hype (i.e. yay - we get to write about AI + Porn - instant clickbait) that nobody will be able to stop and nobody will even care about in 1-2 years: e.g. "oh, yah, another deepfake of <(politician|actress|colleague^)> in a porn film - how incredibly creative".

^ Incidentally, this was made light of in the British version of The Office where a modified image of the boss, David Brent, was sent around the office. If it's joked about harmlessly on mainstream TV, it's probably not a huge concern.

colonelspace

13 hours ago

It's more problematic than photoshopping a person's face was decades ago because we didn't have pervasive online social media decades ago. Computers weren't in your pocket, and an offending image couldn't be beamed to everyone you know instantly.

> If it's joked about harmlessly on mainstream TV, it's probably not a huge concern.

There are plenty of mainstream TV jokes about horrendous things. Our ability to joke about something is a poor way to gauge how concerned we should be.

SmitheryB

13 hours ago

> an offending image couldn't be beamed to everyone you know instantly

Is your concern that recipients would think it's real? Or is it that people will get an email/chat message containing NSFW material? (because the latter happens now, and is nothing more than background noise at most). What is your actual, specific concern? (genuinely, I have no idea, but want to try to understand - to me it feels like panic about a problem that isn't actually a real problem).

colonelspace

13 hours ago

I'm not sure what my personal concerns are specifically, so I don't have a good answer to that question.

But I think there's a difference between a crude photoshopped picture that a person has on their PC, and a (more or less) convincing deepfake video of a person being broadcast.

I think you're likely right that there is an element of panic. I don't know a lot about Korean culture, but I imagine the concern is that deepfakes can have real-life negative impact on people's lives, either because people think it's real or because it creates a hyper-sexualised social environment that endangers women. Both are worthy of concern in my view.

SmitheryB

13 hours ago

Photoshop has for decades been able to produce fake images indistinguishable from real life.

There's still no case for why fake video is apparently so much worse than fake images. Both achieve the same effects when sent maliciously to a relative or contacts list.

I'll (try to) steelman the case for why video's worse: because people don't realise how easy fake videos are to make. But in 1-2 years... meh... most of society will have it figured out.

colonelspace

12 hours ago

But who gives a fuck about people whose lives are ruined in the interim 1-2 years, right?

ssalka

14 hours ago

I could very well be wrong about this, and usually I fall more on the e/acc side, but I think deepfakes are perhaps the gravest immediate concern of AI. Yes, they've been around for well over a decade, but never before has their creation (& thus volume) been so accessible to the average layperson. A simple 10 minutes on CivitAI will tell you all you need to know about how widespread this problem is.

IMO it is only a matter of time before this same issue comes to the forefront in the US (I'm kinda surprised it hasn't already become a bigger topic). I could easily see custom celeb LoRAs/embeddings and realistic SD/SDXL checkpoints being banned outright for this reason. Yes, it would be an infringement on free speech, but weighed against the interests & safety of the public, the courts might see a move like that as reasonable.

comboy

13 hours ago

It's "ban Photoshop" all over again.

It's just the new reality. Porn is not even the worst thing. If anything, it can make it harder for real photos/videos to go viral.

But augmentation will be (if it isn't already) used to make very tiny adjustments: make one candidate's/celebrity's face slightly more symmetrical and the other's less so, change their gait just a little bit. These changes are imperceptible, not interesting enough to make a scandal, and they can have a huge impact on how people perceive other people. I mean, I am not suggesting that it isn't a puppet show without those, but the game is evolving.

JumpCrisscross

13 hours ago

> ban photoshop all over again. It's just the new reality.

We didn't solve teens doctoring inappropriate images of their classmates by banning Photoshop or accepting it as the new normal. We solved it by passing laws which criminalise the behaviour and enforcing them to deter the behaviour.

Technology influences society. But plenty of social mores are almost universally respected despite being difficult to enforce. Being possible and difficult to enforce doesn't make something permissible.

gpm

13 hours ago

I was a teen post-Photoshop and it was just... never an issue. It never occurred to me. No one ever mentioned it to me as a possibility. If someone had, it would have struck me as weird and as way too much work. Photography class in school (in which we learned to use Photoshop) didn't mention any acceptable-use guidelines or anything. As far as I know, no one in my school ever did anything distasteful with Photoshop.

I don't accept the idea that "passing laws and enforcing them to deter behavior" was the cause of the lack of issues, because to be a deterrent, us teens would have had to be aware of the laws.

JumpCrisscross

13 hours ago

> was a teen post photoshop and it was just... never an issue. It never occured to me

Same. And same.

Drawing inappropriate pictures of your classmates was creepy before. Making Photoshop or AI porn remains so now. Most people won't do it. But there is active social reinforcement that keeps it from becoming "the new reality."

> don't accept the idea that "passing laws and enforcing them to deter behavior" was the cause of the lack of issue

Fair enough.

The jargon was cyberbullying, and there was absolutely legislative activity around horrific examples. But the principal mechanism of action probably wasn't a direct fear of getting caught and punished, but the prompted discussion reinforcing and extending the norm.

s1artibartfast

12 hours ago

Did we actually make doctoring a photo a crime?

I thought we had general laws around sexual harassment and revenge porn which criminalize the point of harm.

I bring this up because most of the enforcement difficulty seems to stem from a desire to control behavior which would otherwise be undetectable.

I believe we already have means of enforcement for students sharing pornography of their classmates.

Manuel_D

12 hours ago

Harassment is a lot narrower than people tend to think. It restricts people from making communications that the subject does not want to be a recipient of. It does not restrict people communicating between each other about the subject. Abby following Jane around calling her fat is harassment if it is severe and pervasive. Abby and Betty texting each other, and talking to each other, about how Jane is fat is not.

At least as far as criminal harassment goes. Schools can adopt more restrictive policies but the consequences would only be suspension, expulsion, etc. not criminal charges.

JumpCrisscross

12 hours ago

> Abby and Betty texting each other, and talking to each other about how Jane is fat is not

What if they do so publicly?

Manuel_D

11 hours ago

The takeaway is that harassment laws restrict what people are saying to you, not what they're saying about you among themselves.

If the police actually tried to prosecute them for harassment, the question the courts would be asking is, "could Jane reasonably avoid having to hear it?" Harassment is about being able to not be the recipient of communications you don't want to receive. Calling Jane fat on message boards, twitter, in a bar, etc. would all be fair game.

There are more restrictions if Jane is a captive audience. The courts have ruled that certain settings where people can't easily just leave - namely work and school - carry greater restrictions. Again, the question the court would be asking is "can Jane reasonably avoid this speech?", and it's a much bigger barrier to leave one's workplace than to walk out of a bar.

101011

13 hours ago

Laws can help, but I'm most concerned with states leveraging this to influence foreign campaigns.

There will come a time when a Trump "grab them by the you know what"-style scandal will just be met with skepticism.

Where do you go when even video evidence can't be seen as the truth?

mingus88

13 hours ago

The devices need to cryptographically sign the original video with keys stored in some kind of Secure Enclave

If the video is edited or comes from a source that doesn’t have trusted keys, then it’s no better than some random tweet

colonelspace

12 hours ago

This is just a certificate of authenticity that one either trusts or not, based on whatever one knows about secure enclaves, cryptography, or the device/person issuing the key/certificate.

I think the reality is that photographs and video are now like text, you trust the source or not.

I can write about Cleopatra's ability to snowboard and play Xbox, or I can make a photo with the caption "Cleopatra doing a 1080", and now I can make a video of it (probably).

I can also give you a certificate/key/blockchain whatever to prove it's "valid".

yjftsjthsd-h

11 hours ago

And then someone takes the camera and points it at a screen showing an image of their choice. The analog hole cuts both ways.

0x000xca0xfe

13 hours ago

Photoshop was never an easy tool so powerful that even grandma could produce photo-realistic fake images in seconds. The new wave of AI-powered editing tools is clearly on a new level.

This is like bringing your howitzer to a gun debate because they are kinda similar.

comboy

13 hours ago

Generative AI is much easier than Photoshop in the same way that Photoshop was much easier than trying to alter photos any other way. In both cases it is a completely new level.

JumpCrisscross

13 hours ago

> new wave of AI powered editing tools are clearly on a new level

For the time being, the easiest-to-use tools are centrally run. That makes monitoring and enforcement easier.

ssalka

13 hours ago

A centrally-managed app will always be easier to use than a local self-service tool. But those local apps will keep trying to converge towards the ease of the hosted apps. Over time I expect the difference will become negligible (i.e. a 1-click install and it actually works)

ultimafan

13 hours ago

Photoshopping something convincing has a much higher bar to entry than (as far as I understand it) generating deepfakes does now. Lots of time and practice invested, even for one convincing image.

From what I can see, the issue here is that this problem can potentially become much, much more widespread with AI, because the amount of work required is pretty much none and gets lower as models get better. Someone using deepfakes to e.g. extort people can cast a much wider net by scraping or dumping masses of images into a generator to create a multitude of convincing pictures of many different people, instead of having to focus their time and effort on a single target and creating fakes manually.

RajT88

11 hours ago

Probably half the shorts on Facebook are pretending not to be AI-generated now. And a similar share of the photos in your feed advertising groups.

gjsman-1000

13 hours ago

The above sounds like the opinion of someone who doesn’t have a daughter.

giraffe_lady

13 hours ago

Even trying to be helpful here you're still just centering men's feelings about "their" women.

gjsman-1000

12 hours ago

Do you seriously think most women are on board with this?

Many do not know it’s happening. Those that do and aren’t chronically online find it horrifying and even traumatizing.

Manuel_D

12 hours ago

I think most women aren't on board with people drawing erotic pictures of them with pen and paper either. I think most women aren't on board with their peers fantasizing about them - even if it's completely limited to their imagination.

giraffe_lady

12 hours ago

No, believe me, I know. I have heard women talking about this coming for years. It was already a very common harassment technique before AI made it free, easy, and accessible.

You can tell how few women are involved at the key stages of building new tech because of how rarely obvious harassment vectors are accounted for during development.

The problem I have is with trying to get men to care about it because it affects their daughter or whatever.

ultimafan

12 hours ago

>You can tell how few women are involved at the key stages of building new tech because of how rarely obvious harassment vectors are accounted for during development

Interesting point to think about - it certainly would have helped. But I also genuinely don't see how someone developing this kind of technology, who has had even a brief glimpse of online culture at any point in the last 20 years, didn't immediately see what it would be used for. It almost feels like "progress" with malicious intent. Or an extreme case of apathy towards societal consequences.

>The problem I have is with trying to get men to care about it because it affects their daughter or whatever.

A lot of people are too self-centered to care about any issue unless it directly impacts them. If it gets people thinking about how it affects their loved ones, I don't think that's necessarily a bad thing.

giraffe_lady

11 hours ago

> If it gets people thinking about how it affects their loved ones I don't think that's necessarily a bad thing.

It's just shitty both to men and women. Men don't care about an issue unless it affects their daughters? They don't have other women they care about? Mothers, friends, mentors? Or for that matter sons who are also vulnerable to abuse and manipulation? The men I know are better than this.

And it positions women as being just a motivation for the actions of men, whose proper place is to be protected rather than to be actors in their own right. If you find this tactic sound then feel free to use it, but I think there are better ways.

ultimafan

11 hours ago

I get where you are coming from. The original poster could have phrased it better.

But it's an issue that, both then and now, has predominantly affected women. Sadly, I reckon many men will not care about it, or even be aware of the issue, until it impacts a woman they know.

gjsman-1000

12 hours ago

> The problem I have is with trying to get men to care about it because it affects their daughter or whatever.

Sorry, this is a retarded point of view. You literally are saying you have a problem with people taking action against nonconsensual digital pornography, because it affects and angers people they love.

giraffe_lady

12 hours ago

I do have a TBI but I don't see what that has to do with anything.

gjsman-1000

12 hours ago

A. Idea, not person.

B. To restate, heaven forbid a man take action based on what people he loves would want. All motives must be purely self-satisfying, or purely driven by the affected.

slibhb

13 hours ago

I don't think of myself as on the e/acc side, but I think that deepfakes are here to stay and we'll just have to get used to them. I don't see any way to put the genie back in the bottle but I also don't think it'll take that long to adapt.

whiplash451

13 hours ago

This is giving up a little quickly on watermarking image generation.

I do think it’s a hard problem (especially if we want robustness to compression etc) but amenable to research.

transformi

14 hours ago

Banning LoRAs/embeddings won't help... because you can generate realistic images with only one reference photo these days...

ssalka

14 hours ago

Fair point; technologists will always innovate faster than politicians are able to regulate. The cat's outta the bag now, but that only means the problem will continue to escalate until more drastic measures are deemed necessary.

It's that or we will come to accept some level of societal degradation due to the abuse/misuse of AI tools becoming normalized.

m3kw9

13 hours ago

It won’t be a problem if it gets saturated; people get bored of things easily

ssalka

13 hours ago

As AI image generation (and soon video generation) becomes a more saturated space, we will end up in a state where you cannot trust anything you see online, even if you are educated, well-informed, and meticulously research the topic.

It has already been inching that way for a long time, but generative AI definitely is adding new fuel to the fire.

ryandrake

13 hours ago

I mean, you already kind of cannot trust anything you see online, it's just that so many people still haven't got the message. Maybe everyone needs a deepfake porno of themselves sent to them so they finally grok that everything online should be by default considered fake.

mingus88

13 hours ago

Ah yes, “waiting for the sexual abuser to get bored” sounds like a good tactic and very comforting for young girls

m3kw9

11 hours ago

Once a head attached to a fake body is as prevalent as a click, nobody cares. Two years back it would have been a problem, but like I said, nobody will gaf once everyone and their mom can have their head attached to any body

pessimizer

13 hours ago

> I could very well be wrong about this, and usually I fall more on the e/acc side, but I think deepfakes are perhaps the gravest immediate concern of AI.

I don't think they're a concern at all, other than that they will ruin the evidentiary value of photographs. They'll do that whether or not the general public has access to them.

The concern about fake pictures of people naked is stupid. The same process mentioned above happens to naked pictures of people: they lose evidentiary value. General access to the ability to make deepfakes not only makes it so people won't believe what they see now, but also won't believe actual revenge porn. It simply ceases to be a problem. You can claim it's a fake, and no one can disprove it. This is a reason for celebration.

The standard response for that has been for people to simply claim that it won't matter, and that people will destroy the life of some young innocent girl if a mean boy spreads fake pictures of her naked. The answer is that people who harass a young girl are guilty of harassment, and should be jailed. You shouldn't be harassing people even if naked pictures of them are real, which they often are. And if you'd attack a person over pictures that you could fake up yourself in five minutes, you're either a) a moron or b) somebody using the photos as an excuse, and it would be a benefit for society for you to be locked away from it.

I suspect a lot of people really worried about deepfake porn deep down are worried about their ability to properly identify and harass the harlots if people won't believe the pictures that you send to their family and friends anymore. From my perspective, they're literally fighting to preserve revenge porn.

acdha

12 hours ago

> The concern about fake pictures of people naked is stupid. The same process mentioned above happens to naked pictures of people: they lose evidentiary value.

This is too narrow: it presumes rational contexts where evidence will be evaluated critically, and it assumes a mild, uncreative attacker but harassers are good at exploiting social norms. The damage caused by this kind of harassment isn’t seeing yourself nude as much as wondering or knowing what other people will think, and there’s no way to undo that. Even if people _say_ they believe that the pictures are fake, the victim may be wondering whether that’s really true for the rest of their life.

The other thing to think about is that we’re really just getting started. Most of the deepfakes which the average person has seen are fairly mainstream porn nudes: there’s a huge control issue but in general there are worse things than people thinking you have a star’s body. We’re going to see those worse things soon: it won’t be a porn fantasy but things designed to degrade and more targeted - if the target had body issues, exaggerating them; punishing the person who rejected someone by faking them looking not like a porn star but less attractive and having degrading sex with someone unpopular, knowing that a bunch of people are going to think “well, he would say that”; make some teacher’s life hell by faking them having sex with a student; etc.

Because the attacks are being directed by humans, they’re going to adjust and find what works. Anything which forces an official investigation is going to be especially bad because we can’t just ignore things like child abuse so it’s going to be like SWATing where there’s an ugly period where the wrong teenager can get an incredibly high-leverage response with a single message.

Manuel_D

12 hours ago

> make some teacher’s life hell by faking them having sex with a student;

The harm comes from falsely accusing someone of rape, not the deepfake itself. This can be done with or without a deepfake.

As a counterpoint: if a group of boys find their teacher really hot and use Stable Diffusion to inpaint over her clothes and create a fictional naked photo, where's the harm? They could also just use their imaginations to fantasize about what she'd look like naked. I don't really see it as that different - they're essentially delegating the imagination to the computer.

The typical response is that if the teacher discovers these photos, that discovery could be disturbing for her. I do agree. But at the same time, if these boys walk up to her and tell her "I just jacked off to fantasies of you last night", that's also probably disturbing too. Like the first scenario, the deepfake is tangential to the actual harm: the harm stems from telling another person that you've fantasized about them sexually - not the exact nature of whether that fantasy was produced by a human brain or a silicon one.

RajT88

11 hours ago

Women get fired from jobs for naked photos of them leaking - photos obtained through illicit means, photos taken consensually or even selfies not shared with anyone in particular.

As long as it looks convincing, when deepfakes start circulating like this the subject of the deepfake will face consequences far in excess of what you are suggesting. Hell, in some parts of the country even obvious deepfakes may get a woman in trouble. Any woman, not just a teacher.

Manuel_D

10 hours ago

Again, the harm comes from dishonesty not the deepfake itself. Calling someone's employer and falsely accusing them of stealing from their past companies or abusing their co-workers achieves the same purpose. If a company fires an employee because of fictional content, the blame lies with the company.

acdha

8 hours ago

> The harm comes from falsely accusing someone of rape, not the deepfake itself. This can be done with or without a deepfake.

The person I was responding to said they were not “a concern at all”, but clearly that can’t be true if they make harassment and false accusations easier and more damaging. Similarly, we shouldn’t dismiss the impact of making those attacks stronger - it’s like saying that giving a kid a gun is no big deal because they used to have a slingshot.

You missed the reason why I mentioned that scenario. It's not just that the targeted teacher gets harassed — and, let's be honest, that'd be a lot of aggression just below the threshold of serious disciplinary measures — but that something like that would, if found, trigger a mandatory law enforcement response. School employees tend to be mandatory reporters, so they don't have the option of saying "that's a fake, he's probably lying".

The real harm is that unless it’s a very obvious fake, the victim is going to have to go through a police investigation into their personal life. Want to bet that’d especially be used to target people who have reason not to trust the police because they’re gay, brown, not an evangelical Christian, etc.?

ultimafan

13 hours ago

I think the problem is that people aren't necessarily logical or rational about such things. I know someone (family friend) who was the victim of similar harassment with (clearly) photoshopped images being sent anonymously to her work, family, friends on social media. The fact that they were fake didn't diminish the humiliation or stress she went through and I don't think she ever really recovered from the trauma of having some anonymous asshole harassing and stalking her and her family online. It eventually stopped but she was afraid of it starting again for a long time. "It's fake" isn't much of a comfort to someone being harassed and being afraid of harassment escalating in one way or another. And it getting easier and easier for crazy people to do this with less and less effort doesn't help the issue at all regardless of what laws are passed. Finding better ways to deal with these people is a good thing but enabling them to make it easier isn't, in my opinion.

I think for a lot of people it's easy to die on the hill of technological progress and freedom at any cost when the consequences haven't happened to you or a loved one.

ryandrake

13 hours ago

We definitely need to crack down more on harassment and stalking and not worry so much about what form that harassment and stalking takes. It's insane that you can have someone all over you, sending you unwanted messages, following you, or generally targeting you with misery that falls short of physical assault, and the police won't do anything about it until you're physically hurt. It shouldn't matter if the harassment is phone calls at 4AM or deepfakes. Harassment is harassment.

ultimafan

12 hours ago

I agree. But the internet age unfortunately seems to have made this much more difficult than it should be. People overshare on social media and make themselves too visible because they aren't aware of the consequences - or don't assume there are people out there sociopathic enough to harass someone to that extent. The pool of potential suspects acting anonymously has grown from whoever is in your local community to whoever has an internet connection anywhere in the world. It's not uncommon to hear of harassment and blackmail cases where the perpetrator doesn't even live in your country, so police couldn't do anything even if they had their name and real evidence.

ssalka

13 hours ago

> ruin the evidentiary value of photographs

This is going to be a monumental issue when it fully takes hold that we can't trust any photograph. Maybe watermarking will help with that. More likely, people will develop tools to circumvent watermarking technology (either on the imprinting side or on the verification side).

> people won't believe what they see now, but also won't believe actual revenge porn. It simply ceases to be a problem.

This may be a comforting way to think about it, but it won't work in all cases. Consider, instead of revenge porn, the case of CSAM – that will never be a solved problem until 100% of it is wiped from the face of the earth and it is never produced again. Making it less believable because there are so many AI-generated equivalents out there is not a solution to the problem; it exacerbates the problem.

JumpCrisscross

13 hours ago

> when the fact that we can't trust any photographs fully takes hold

Full faith in photography was never a thing. If anything, the advent of photography came alongside yellow journalism. Trust is an institution. You can't build an institution solely on technology.

ssalka

13 hours ago

Good context. But the fact is, a great many people – perhaps even the majority of people – are quite used to looking at a photograph and fully trusting that it depicts something which actually happened. As they say, seeing is believing.

Even with CGI being what it has been since the 70s, it's still appropriate to apply Occam's Razor in many cases, and to simply say that because it looks real, it probably is.

2OEH8eoCRo0

13 hours ago

> Yes, it would be an infringement on free speech

Freedom of speech is mainly so that we can criticize the government. How does deepfake porn or impersonation have anything to do with speech?

delichon

13 hours ago

I think freedom of speech is mainly about freedom of thought and just plain freedom, and the ability to criticize the government is a side benefit. And that it should also include the freedom to augment our imaginations with the prosthetics of our choice.

2OEH8eoCRo0

13 hours ago

But people have always been free to talk about the weather, the killer feature is being able to talk about your leaders.

giraffe_lady

13 hours ago

Cannot wait to augment the imaginations of his coworkers with a realistic video of this guy fucking a goat.

JumpCrisscross

13 hours ago

> Freedom of speech is mainly so that we can criticize the government

Freedom of speech is a bigger concept than the First Amendment. Technically curtailing one's ability to make AI porn of your colleagues is a restriction on speech. That said, so is every spam filter, fraud statute and retaliation against a declaration of war. A right doesn't have to be absolute to be good (or vice versa).

gpm

13 hours ago

> spam filter

Your other examples, yes. This one, not at all.

Freedom of speech is against government intervention, not against private agreements.

Rather, an absolute form of freedom of speech would protect you against the government stopping you from forwarding all your correspondence to someone else, and would protect that someone else against the government stopping them from removing the spam and sending the rest back to you. Non-government-run spam filters don't just not offend freedom of speech; they are protected by it.

JumpCrisscross

13 hours ago

> Freedom of speech is against government intervention, not against private agreements

You're conflating the First Amendment and freedom of speech [1]. If you're in my house and I prohibit you from expressing certain views, that is fine by the First Amendment but not in the spirit of free speech.

[1] https://en.wikipedia.org/wiki/Freedom_of_speech

gpm

13 hours ago

No I'm not. You're conflating freedom of speech with a duty to listen. Freedom of speech entails a freedom to send spam. It doesn't entail a duty to receive spam, to read spam, to not have someone else filter your communications for spam, etc.

And as I argued above, since every step in having someone filter your communications for spam is simply speech, it protects spam filters.

JumpCrisscross

13 hours ago

> No I'm not

You restricted it to "government intervention, not...private agreements." The freedom of speech extends to the private sphere.

> You're conflating freedom of speech with a duty to listen

These are inextricably linked in the classical form. In the Athenian tradition, there was a duty to listen [1]. (Though even then practicality made it far from absolute.) That traces through to the Protestation of 1621 [2] and the Founders' debates around the First Amendment (namely, between Madison and Hamilton).

Practically, of course, no duty is absolute. But a society where everyone screams into a void isn't adhering to the principles of free speech and expression. (I'll note that alongside the duty to listen is an implied duty on the part of the speaker to speak truly [3]. Spam is the epitome of the cessation of the latter revoking the former.)

[1] https://www.stoa.org/demos/article_democracy_overview@page=a...

[2] https://en.wikipedia.org/wiki/Protestation_of_1621

[3] https://en.wikipedia.org/wiki/Parrhesia

gpm

12 hours ago

I think you're confusing something along the lines of "parliamentary privilege" with freedom of speech. Various forms of government have imposed, as part of their decision-making process, a duty to listen to various forms of official speech. That's not a component of freedom of speech, though; rather, it's a component of that specific form of government's process.

I'm not sure where you find a duty to listen in Athens, my best guess is you are pointing to the duty to attend the assembly, which fits this form of "parliamentary privilege" rather than a broader duty to listen. To contrast, by my understanding political commentary in the form of plays and the like were quite common in Athenian culture, and I'm fairly sure there was absolutely no duty to pay attention to those. I'd also note that even with the assembly "the crowd might raise a clamor and refuse to listen to a speaker advocate an unpopular proposal" (your source) which suggests little duty to actively listen even there.

The protestation of 1621 is directly related to parliamentary privilege (and why I labelled this the way I did, though I don't think the label fits perfectly, since it applies to many smaller government meetings than parliament).

I can't say I'm familiar with the first amendment debate you are discussing.

> You restricted it to "government intervention, not...private agreements."

I don't think this is the cause of our disagreement, but I'll grant you that that phrase was slightly inartful. E.g. there are authorities other than governments, like a local gang, who could impinge on freedom of speech with threats of retaliation, and to the extent that private agreements are enforced by an authority it's possible for them to do so.

JumpCrisscross

12 hours ago

> you're confusing something along the lines of "parliamentary privilege" for freedom of speech

Technically, parliamentary privilege derives from the Athenian model of freedom of expression. In the case of the latter, the privilege (and duty) extended to the whole demos. Politics were never “off.” (Well, unless you were a child. Or a woman. Or a slave. Or a foreigner.)

One can trace it from there to the Roman Republic, and most pointedly, from folks in the early Empire lamenting how private society proactively narrowed the Overton window to align with perceptions of Imperial preference. (The Senate floor was muzzled by dissuading private criticism, officially and informally.)

Those “classics” then went to England and France, and then to America and Europe’s colonies, and then to the UN and EU. TL;DR: freedom of expression in Western thought as I understand it.

> which suggests little duty to actively listen even there

It was generally notable enough to be recorded, which suggests a general restraint. Not an absolute duty. But more than a little.

> don't think this is the cause of our disagreement

I don’t actually think we disagree :). We just know more about different aspects of the same thing.

2OEH8eoCRo0

13 hours ago

All speech is information but not all information is speech

JumpCrisscross

13 hours ago

> information != speech

Information intentionally communicated by a person to another is speech by the strictest classical definition. (I guess technically the creation of the AI porn isn't speech in the way a falling tree in a deserted forest doesn't make a perceived sound. But it's a schoolyard riddle for a reason.)

ssalka

13 hours ago

Freedom of speech is there to protect objectionable/offensive speech, whether it is directed towards the government or otherwise. Objectionable speech is frequently in the crosshairs of pro-censorship folks, and many people think of deepfakes as objectionable – reasonably so.

ChrisArchitect

12 hours ago

The subheading on this describes the story a bit better:

South Korea’s national assembly on Thursday passed a Bill to punish people possessing, purchasing, saving or viewing deepfake sexual materials and other fabricated videos.

oska

5 hours ago

Laws that prosecute people for possessing, downloading and storing computer files are deeply problematic, in my opinion. Whatever the content of those files.

(Publication of the contents of the files is another matter. I appreciate how this can be cause for prosecution.)

BurningFrog

13 hours ago

When I grew up "a photo never lies" was a rule you could rely on. Yes really!!

Photoshop and other advances long since killed that certainty, and now we have a healthy distrust of pictures we're not sure where they came from.

I expect videos will now walk a similar path, and life will go on with no major disasters.

singleshot_

13 hours ago

When I grew up, my friend's dad had a picture of his British public school class, taken circa 1955 by a slowly rotating panorama camera. A lad started at the beginning of the photo, on the viewer's left. Once the camera panned enough that he was out of frame, he sprinted behind the bleachers to the viewer's far right and took his place on the bleachers before the camera panned to take his picture for a second time.

Kindly, if you grew up after this picture was taken, and you believed the guideline “a photo never lies,” you were wrong.

falcolas

10 hours ago

> now we have a healthy distrust of pictures we're not sure where they came from.

Respectfully, we only wish this were the case. There are, right now, AI-generated images and videos of a presidential candidate in a disaster area. People want to believe they are real. And so, to millions of people, they are real.

People do the same with faked nudes. They will believe in those nudes because they believe that women are inherently sinful (huzzah for "original sin") and absolutely would do such things.

Oh, and Photoshop never had a dedicated app (or 100 dedicated apps) for creating nudes from regular photos and/or videos. It took a skill a scant fraction of a percent of people had.

colonelspace

13 hours ago

> When I grew up "a photo never lies" was a rule you could rely on.

It was a rule you thought you could rely on.

BurningFrog

11 hours ago

I should have remembered that I'm writing to an audience prone to passionate literalism...

I'm of course aware that there were forged photos back in the day. But they were rare and hard to make. They were usually paper only. Producing a fake negative was really hard, and a forensic expert could often spot it.

up-n-atom

14 hours ago

Makes sense and I can see this becoming a global trend if it's not already. But the irony of the news agency taking and posting their photo/video is a facepalm.

EDIT: I can foresee a dark future where surveillance video is monetized for such salacious purposes and it becomes inescapable as the cameras are everywhere.

danparsonson

13 hours ago

That is the dark present in South Korea as they have a huge problem with cameras hidden in hotel rooms and even toilets :-/

Simulacra

13 hours ago

Could be worse than that. With all of the facial recognition and video recording the government is doing, the police could, if they wanted to, insert you into footage committing any crime they want.

frognumber

13 hours ago

I feel like this is a problem without a solution, or at least without a solution more expensive than the problem:

* I suspect that, like Prohibition or the DMCA, criminalizing the behavior will lead to worse outcomes than accepting that deepfake porn sometimes happens, at least if laws are passed targeting this in isolation.

* I suspect that before acceptance, many people will be harmed, on both sides. Humiliated teen girls. Immature hormone-driven teen boys.

* I suspect that we'll see a range of bigger problems such as real scams where, for example, a deepfake child calls their elderly parents and empties their bank account. Or, perhaps, a deepfake general calls someone in a nuclear bunker.

* Deepfakes are just the tip of the technological problems which need solving. We have other things, like the cheap and easy availability of the technology to make the next Covid-19. The point is we have the technology to make it in a lab, and the price is dropping along an almost Moore's-Law-like curve.

I don't see a good solution short of completely rethinking democracy and capitalism from scratch. The first time around took the Age of Enlightenment and the Age of Revolution.

It's perhaps time to do that.

Prove me wrong. Seriously. If there are incremental solutions which stop deepfake porn, deepfake scams, personal drone strikes, and super-viruses, please let me know what they are.

blackeyeblitzar

13 hours ago

I honestly don’t understand the problem with deepfake porn. It’s the same as a fantasy you might have in your head about someone you know. As long as it is not passed off as real, which might be defamatory, I don’t have a moral problem with it. My bigger concern is that a sudden spike in puritanism is going to be used to justify limitations on free speech - which is what a lot of AI regulation is.

thih9

13 hours ago

> As long as it is not passed off as real, which might be defamatory

That’s basically it. AI is extremely accessible and still relatively unknown. Virtually anyone can create a deepfake photo for free in seconds, post it anywhere and a lot of people will assume it is genuine.

Mistletoe

14 hours ago

I find deepfakes inherently non-sexy. The whole point of a nude or porn is to accurately reveal something hidden; what's the point if it is just made up?

skocznymroczny

13 hours ago

Right now you can generate hundreds of realistic fake nudes of celebrity XYZ and no one cares. Celebrity XYZ accidentally leaks a nude photo from their phone and the entire world loses their mind.

I think after the initial moral panic and once the novelty wears off, this issue will fade into obscurity. And currently existing laws should cover sharing such imagery/videos anyway.

PlunderBunny

13 hours ago

I think the point is often to harm/exert control over the person depicted, e.g. when boys are making deepfake nudes of girls at their school, as happened in Spain and the USA recently.

bialpio

13 hours ago

What are your thoughts on hentai then? It's all made up after all.

Meta-point is that different people get off on different things, so using seemingly objective language ("the whole point of A is B") to describe something extremely subjective is imo counterproductive.

anigbrowl

13 hours ago

Some people don't care about the fakery and are happy with a low quality way to beat off.

Other people don't care because their goal is not to get aroused but to embarrass the persons whose real faces have been faked onto the AI-generated or AI-edited porno footage.

Some people say 'logically we all know it is fake so just ignore it'. This demonstrates something of a lack of imagination, an inability to picture oneself or a beloved family member being put in that position. It also overlooks that different societies have different social standards and that social mobs tend toward the lowest common IQ denominator.

South Korea is a very conservative society, whether you agree with this conservatism or not. Being in a deepfake is easily enough to ruin a person's social or professional reputation, and schoolchildren are especially vulnerable to such pressures. It's not so different from the US problem with sextortion, where teens are tricked into sending nudes to an apparent romantic prospect, and then blackmailed. In both countries, such abuse can end in suicide.

In even more conservative countries like Islamic societies, the victims of such fakes could conceivably suffer criminal penalties if authorities don't believe (or deliberately subvert) their protestations of innocence.

pessimizer

13 hours ago

> Being in a deepfake is easily enough to ruin a person's social or professional reputation

This is the problem with Korea, not deepfakes. That they have US-adjacent employment laws, and will fire people based on arbitrary anonymous bile. This shouldn't happen even when the photos are real. The idea that women who have been naked shouldn't work is repulsive.

anigbrowl

9 hours ago

Don't you think it's just a bit arrogant to say 'you should change your whole culture to accommodate this technology'? I don't care for SK's social conservatism either, but 'your society is wrong' is not a helpful response to people who don't want to be hassled by deepfake makers. Frankly I'm surprised that the conservative SK government is taking the issue seriously enough to legislate on it.