‘IRL Fakes:’ Where People Pay for AI-Generated Porn of Normal People

A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

OozingPositron ,
@OozingPositron@feddit.cl avatar

You don't even need to pay, some people do it for free on /b/

guyrocket ,
@guyrocket@kbin.social avatar

This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why "AI" being involved matters is beyond me. The result is the same: fake porn/nudes.

And all the hand wringing in the world about it being non consensual will not stop it. The cat has been out of the bag for a long time.

I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.

dysprosium ,

Exactly this. Better to trust cryptographically signed images, verified by comparing hashes against the ones supplied by the owner. Then it becomes a question of trusting a specific source for a specific kind of content. A news photo of the war in Ukraine by the BBC? Check the hash on their site. Their reputation is finished if a single forged image is found.
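
For illustration, a minimal sketch of that check in Python, assuming the publisher posts a plain SHA-256 digest alongside the image (both URLs are hypothetical; real provenance schemes like C2PA bind a full cryptographic signature to the publisher's identity rather than a bare hash):

```python
import hashlib
import urllib.request

# Hypothetical URLs: the photo, and the digest the publisher says it should have.
IMAGE_URL = "https://news.example.com/photos/report-1234.jpg"
DIGEST_URL = "https://news.example.com/photos/report-1234.jpg.sha256"

def sha256_hex(data: bytes) -> str:
    """Hex-encoded SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

with urllib.request.urlopen(IMAGE_URL) as resp:
    image_bytes = resp.read()

with urllib.request.urlopen(DIGEST_URL) as resp:
    published_digest = resp.read().decode().strip()

if sha256_hex(image_bytes) == published_digest:
    print("Image matches the publisher's digest.")
else:
    print("Mismatch: this is not the image the publisher vouched for.")
```

A bare hash only proves the bytes match what the publisher currently lists, so the trust still rests entirely on the source hosting the digest.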

T156 ,

At the same time, that does introduce an additional layer of work. Most people aren't going to bother, given the extra effort involved, in much the same way that people today won't trace an image back to its original source, but usually just go by the version they saw.

Especially for people who aren't so cryptographically or technologically inclined that they know what a hash is, where to find one, and how to compare it (without just opening them both and checking personally).

dysprosium ,

Sure, but that's no problem if software did it automatically for users of big (news) sites: browsers on desktop and apps on phones.

kent_eh ,

People have been Photoshopping this kind of thing since before there was Photoshop. Why "AI" being involved matters is beyond me

Because now it's faster, can be generated in bulk and requires no skill from the person doing it.

Dkarma ,

Not relevant. Using someone's picture never ever required consent.

Bob_Robertson_IX ,

A kid at my high school in the early 90s would use a photocopier to literally cut and paste yearbook headshots onto porn photos. This could also be done in bulk, and it doesn't require any skills that a 1st grader doesn't have.

driving_crooner ,
@driving_crooner@lemmy.eco.br avatar

And did it look as realistic as the AI jobs?

nudnyekscentryk ,
@nudnyekscentryk@szmer.info avatar

But now they're photorealistic

ChexMax ,

Those are easily disproven. There's no way you think that's the same thing. If you can pull up the source photo and it's a clear match/copy for the fake, it's easy to disprove. AI can alter the angle, position, and expression on your face in a believable manner, making it a lot harder to link the photo to the source material.

Bob_Robertson_IX ,

This was before Google was a thing, much less reverse lookup with Google Images. The point I was making is that this kind of thing happened even before Photoshop. Photoshop made it look even more realistic. AI is the next step. And even the current AI abilities are nothing compared to what they will be even 6 months from now. Yes, this is a problem, but it has been a problem for a long time, and anyone who has wanted to create fake nudes of someone has been able to do so easily for at least a generation. We might be at the point where, if you want to make sure no fake nudes are created of you, you don't have images of yourself published. But now that everyone has high-quality cameras in their pockets, even that won't 100% protect you.

ArmokGoB ,

I blame electricity. Before computers, people had to learn to paint to do this. We should go back to living like medieval peasants.

0x0 ,

Those were the days...

KISSmyOS ,

This, but unironically.

Lucidlethargy ,
@Lucidlethargy@sh.itjust.works avatar
nednobbins ,

As much skill as a 9-year-old and a 16-year-old can muster?

https://en.wikipedia.org/wiki/Cottingley_Fairies

Vespair ,

no skill from the person doing it.

This feels entirely non sequitur, to the point of damaging any point you're trying to make. Whether I paint a nude or the modern Leonardo da Vinci paints a nude, our rights (and/or the rights of the model, depending on your perspective on this issue) should be no different, despite the enormous chasm between our artistic skill.

daddy32 ,

Scale.

echo64 ,

I hate this: "Just accept it women of the world, accept the abuse because it's the new normal" techbro logic so much. It's absolutely hateful towards women.

We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.

AquaTofana ,

I don't know why you're being downvoted. Sure, it's unfortunately been happening for a while, but are we just supposed to keep quiet about it and let it go?

I'm sorry, putting my face on a naked body that's not mine is one thing, but I really do fear for the people whose likeness gets used in some degrading/depraved porn that's actually believable because it's AI-generated. That is SO much worse/psychologically damaging if they find out about it.

HubertManne ,
@HubertManne@kbin.social avatar

typical morning for me.

0x0 ,

Because gay porn is a myth I guess...

Jrockwar ,

And so is straight male-focused porn. We men seemingly are not attractive, other than for perfume ads. It's unbelievable gender roles are still so strongly coded in 2024. Women must be pretty; men must buy products where women look pretty in ads. Men don't look pretty and women don't buy products - they clean the house and care for the kids.

I'm aware of how much I'm extrapolating, but a lot of this is the subtext under "they'll make porn of your sisters and daughters" while leaving your good-looking brother/son out of the thought train, when that'd be just as hurtful for them and for you.

lud ,

Or your bad looking brother or the bad looking myself.

Imo people making AI fakes for themselves isn't the end of the world; the real problem is distribution and blackmail.

You can get blackmailed no matter your gender and it will happen to both genders.

echo64 ,

Sorry if I didn't position this about men. They are the most important thing to discuss and will be the most impacted here, obviously. We must center men on this subject too.

Thorny_Insight ,

Pointing out your sexism isn't saying we should be talking about just men. It's you who's here acting all holy while ignoring half of the population.

echo64 ,

Yes yes, amirite? We just ignore that 99.999% of the victims will be women, just so we can grandstand about men.

brbposting ,

It’s unacceptable.

We have legal and justice systems to deal with this.

For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):

https://sh.itjust.works/pictrs/image/2afc9d80-2b27-4aa6-97f5-c845700b2cb7.jpeg

Ill adults and poor kids generate and sell CSAM. Common to advertise on IG, sell on TG. Huge problem as that Stanford report shows.

https://sh.itjust.works/pictrs/image/be0d748e-a98d-4eaf-9a36-b21d51b1161e.jpeg

Telegram got right on it (not). Fuckers.

SharkAttak ,
@SharkAttak@kbin.social avatar

It's not normal, but neither is it new: you could already cut and glue your cousin's photo onto a Playboy girl, or Photoshop the hot neighbour onto Stallone's muscled body. Today it's just easier.

echo64 ,

I don't care if it's not new, no one cares about how new it is.

cley_faye ,

How do you propose to deal with someone doing this on their computer, not posting them online, for their "enjoyment"? Mass global surveillance of all existing devices?

It's not a matter of willingly accepting it; it's a matter of looking at what can be done and what can not. Publishing fake porn, defaming people, and other similar actions are already (I hope… I am not a lawyer) illegal. Asking for the technology that exists, is available, will continue to grow, and can be used in a private setting with no witness to somehow "stop" because of a law is at best wishful thinking.

Ookami38 ,

There's nothing to be done, nor should anything be done, about what someone individually creates for their own individual use, never to see the light of day. Anything else is about one step removed from thought policing - after all, what's the difference between a personally created, private image and the thoughts in your brain?

The other side of that is, we have to have protection for the people this has been or will be used against. Strict laws regarding posting or sharing material. Easy and fast removal of abusive material. Actual enforcement. I know we have these things in place already, but they need to be stronger and more robust. The one absolute truth with generative AI, versus Photoshop etc., is that it's significantly faster and easier, so there will likely be an uptick in this kind of material, hence the need to re-examine current laws.

Assman ,
@Assman@sh.itjust.works avatar

The same reason AR-15 rifles are different from muskets

HubertManne ,
@HubertManne@kbin.social avatar

This is something I can't quite get through to my wife. She does not like that I dismiss things to some degree when they don't make sense. We get into these convos where I'm like, I have serious doubts about this, and she's like, are you saying it did not happen? And I'm like, no, it may have happened, but not quite in the way they say, or it's being portrayed in a certain manner. I'm still going to take video and photos as likely true for now, but I generally want to see it from independent sources, like different folks with their phones along with CCTV of some kind and such.

Pretzilla ,

Ok so pay the dude $10 to put your wife's head on someone agreeing with you. Problem solved.

HubertManne ,
@HubertManne@kbin.social avatar

lol. there you go. hey, you cheated on me, it's in this news article right here.

roscoe ,

I didn't expect to get a laugh out of reading this discussion, thanks.

AstralPath ,

This kind of attitude toward non-consensual actions is what perpetuates them. Fuck that shit.

SharkAttak ,
@SharkAttak@kbin.social avatar

But I saw it on tee-vee!

EatATaco ,

The irony of parroting this mindless and empty talking point is probably lost on you.

SharkAttak ,
@SharkAttak@kbin.social avatar

God, do I really have to start putting the /jk or /s back, for those who don't get it like you??

EatATaco ,

Upgraded to "definitely."

SharkAttak ,
@SharkAttak@kbin.social avatar

Okay, okay, you won. Happy now? Now go.

EatATaco ,

Ok thanks

EatATaco ,

I suck at Photoshop, and I've tried many times over the years to get good at it. Meanwhile, I was able to train a local Stable Diffusion model on my face and my family's faces and create numerous images of us in all kinds of situations in 2 nights of work. You can get a snap of someone and have nudes of them tomorrow for super cheap.

I agree there is nothing to be done, but it's painfully obvious to me that the scale and ease of it make it much more concerning.

T156 ,

Also the potential for automation/mass production. Photoshop work still requires a person to sit down and do the actual editing. You can try to script things out, but it's hardly an easy affair.

By comparison, generative models are much more hands-free. Once you get the basics set up, you can just let it go and churn things out at rates well surpassing what a single human could reasonably do (if you have the computing power for it).

A_Very_Big_Fan ,

Why "AI" being involved matters is beyond me.

The AI hysteria is real, and clickbait is money.

General_Effort ,

Porn of Normal People

Why did they feel the need to add that "normal" to the headline?

sentient_loom ,
@sentient_loom@sh.itjust.works avatar

To differentiate from celebrities.

AdamEatsAss ,

This Telegram user has a hard stance on "weirdos".

TheGrandNagus ,

Because it's different to somebody going online and finding a stock picture of Taylor Swift

yildolw ,

People who have Wikipedia articles have less of an expectation of privacy than normal people

echo64 ,

Every time this comes up, all the tech nerds here like to excuse it as fine and not a bad thing at all. I am hoping this won't happen this time, but knowing lemmy's audience...

sbv ,

The Lemmy circlejerk is real, but excusing deep fake porn is pretty off brand for us. I'm glad the comments on this post are uniformly negative.

echo64 ,

https://sh.itjust.works/comment/10397565

https://kbin.social/m/technology@lemmy.world/t/927248/-/comment/5921190 just accept it as a new normal, it's fine. Can't possibly have any recourse, just accept it, women of the world, it's the new normal!

sbv ,

Okay, there are a couple of douche canoes, but generally speaking, I think we're okay on this one.

echo64 ,

It is massively upvoted (for lemmy).

Thorny_Insight ,

I'm not saying it's not a bad thing, but it's inevitable. The problem will just keep getting worse and there's no stopping it. It's something we're simply going to have to accept as a new normal. If we can deal with living under the constant threat of nuclear armageddon, then I think we can live with fake nudes as well.

echo64 ,

Yeah, it's this shit I'm talking about. We have a whole legal and justice system to deal with this. No one needs to accept sexual abuse as a new normal. This shit is weird.

Thorny_Insight ,

I'm not saying there shouldn't be consequences for someone who is spreading these pictures with the intention to cause harm to someone's reputation but it's incredibly naive to think that the justice system is going to stop deepfakes when it can't even prevent bike theft. 12 year olds are making these with their smartphones. The technology is extremely accessible and easy to use and that is not going to change. I'm sorry but you're not putting the toothpaste back into the tube. Wait a few years and you can generate photorealistic porn videos of anyone you want.

echo64 ,

We can't stop bike theft, so fuck off women, you're free game 'coz this guy said so.

Thorny_Insight ,

When you start strawmanning you've already lost the argument.

echo64 ,

You might want to look up what strawmanning means. I'm just flat out mocking what you said.

Dkarma ,

No we don't. What is happening here is not covered by current laws.

0x0 ,

Sexual abuse?

Child pornography involves molesting a child and is a crime, as it should be.

Fake nudes have been a thing for ages and are only an issue if the targeted party takes offense. It may be slander but it's certainly not sexual abuse.

No one is accepting sexual abuse so drop it down a notch, Karen.

sbv ,

only an issue if the targeted party takes offense.

Deep fakes can change how the victim is treated by other people. Especially other kids.

Upthread, someone states

we're just going to need to accept as a new normal.

Which sounds a lot like accepting this kind of shit, regardless of what you call it.

eatthecake ,

From another comment:

To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public over half of people involved will have the urge to kill themselves. There’s also extreme risk of feeling depressed, angry, anxiety, etc. The analogy given is it’s like watching video the next day of yourself undergoing sex without consent as if you’d been drugged.

Try to imagine watching a realistic video of yourself being abused, imagine your mother watching. That will absolutely fuck some people up, and a lot of those victims are going to be children. Shit is going to get bad.

0x0 ,

I wouldn't put actual non-consensual pornography and fake pornography of any kind in the same bag but, geez, I'm not a Dr.

Deepfakes do improve on the (technical) realism of 90s Photoshop, for sure. Doesn't that still qualify as slander? (Also not a lawyer.)

eatthecake ,

The question is why, with an internet full of porn, do men want non consensual pornography that they know women are opposed to. It's as if the hurtfulness, the lack of consent and the control over the woman in the video are actually the point.

0x0 ,

Non-consensual pornography is called rape and it's a crime in most of the world.

roscoe ,

I think part of the difficulty discussing this is the discussions usually combine two different things. The production and distribution.

I was informed elsewhere in this thread people can already produce these images/videos on their own machines with no third parties involved or remote processing. I can't think of a single thing that can be done about that so acceptance is all we've got.

Nonconsensual sharing, on the other hand, we can and should do something about. The legal system won't be able to stop it altogether, but it can push it to the fringes and stop it from becoming mainstream, so victims wouldn't see fake images/videos of themselves proliferating everywhere.

cley_faye ,

It's not a matter of excusing it. Distribution of someone's picture without their explicit consent, and anything like that, is inexcusable. But we're talking about the generation of said content, which technically can't be stopped without seriously restraining everything.

RobotToaster ,
@RobotToaster@mander.xyz avatar

This is only going to get easier. The djinn is out of the bottle.

goldteeth ,

"Djinn", specifically, being the correct word choice. We're way past fun-loving blue cartoon Robin Williams genies granting wishes, doing impressions of Jack Nicholson and getting into madcap hijinks. We're back into fuckin'... shapeshifting cobras woven of fire and dust by the archdevil Iblis, hiding in caves and slithering out into the desert at night to tempt mortal men to sin. That mythologically-accurate shit.

BrokenGlepnir ,

Have you ever seen the wishmaster movies?

db2 ,

Make. Your. Wishes.

roscoe ,

As soon as anyone can do this on their own machine with no third parties involved all laws and other measures being discussed will be moot.

We can punish nonconsensual sharing but that's about it.

CeeBee ,

As soon as anyone can do this on their own machine with no third parties involved

We've been there for a while now

roscoe ,

Some people can, I wouldn't even know where to start. And is the photo/video generator completely on home machines without any processing being done remotely already?

I'm thinking about a future where simple tools are available where anyone could just drop in a photo or two and get anything up to a VR porn video.

CeeBee ,

And is the photo/video generator completely on home machines without any processing being done remotely already?

Yes

roscoe ,

Well...shit. It seems like any new laws are already too little too late then.

JDPoZ , (edited )
@JDPoZ@lemmy.world avatar

Stable Diffusion has been easy to install locally and run on any decent GPU for 2 years at this point.

Combine that with Civitai.com for easy-to-download-and-run models of almost anything you can imagine - IP, celebrities, concepts, etc… and the possibilities have been endless.

In fact, with completely free apps like Draw Things on iOS, which lets you run it on YOUR PHONE locally - where you can download models, tweak, customize, and hand it images directly from your mobile device's library - making this stuff is now trivial on the go.

T156 ,

Tensor processors/AI accelerators have also been a thing on new hardware for a while. Mobile devices have them, Intel/Apple include them with their processors, and it's not uncommon to find them on newer graphics cards.

That would just make it easier compared to needing quite a powerful computer for that kind of task.

neptune ,

I can paint as many nude images of Rihanna as I want.

yildolw ,

You may be sued for damages if you sell those nude paintings of Rihanna at a large enough scale that Rihanna notices

conciselyverbose ,

Doesn't mean distribution should be legal.

People are going to do what they're going to do, and the existence of this isn't an argument to put spyware on everyone's computer to catch it or whatever crazy extreme you can take it to.

But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.

ITGuyLevi ,

While I agree in spirit, any law surrounding it would need to be very clearly worded, with certain exceptions carved out. Which I'm sure wouldn't happen.

I could easily see people thinking something was of them, when in reality it was of someone else.

treadful ,
@treadful@lemmy.zip avatar

Almost always it makes more sense to ban the action, not the tool. Especially for tools with such generalized use cases.

cley_faye ,

I'm not familiar with the US laws, but… isn't it already some form of crime to distribute nudes of someone without their consent? That should not change whether or not AI is involved.

T156 ,

It might depend on whether fabricating them wholesale would be considered a nude of the victim at all. Legally, it could be considered a different person, since the "nude" is someone else's body with the victim's face put on top, or a complete fabrication made by a computer.

Unclear if it would still count if it was someone else and they were lying about it being the victim, for example, pretending a headless mirror-nude was sent by the victim, when it was sent by someone else.

JackGreenEarth ,
@JackGreenEarth@lemm.ee avatar

That's a ripoff. It costs them at most $0.10 to do simple Stable Diffusion img2img. And most people could do it themselves; they're purposefully exploiting people who aren't tech savvy.

M500 ,

Wait, this is a tool built into Stable Diffusion?

As for people doing it themselves, it might be a bit too technical for some people to set up. But I've never tried Stable Diffusion.

SorteKanin ,
@SorteKanin@feddit.dk avatar

It's not like deepfake pornography is "built in", but Stable Diffusion can take existing images and generate stuff based on them. That's kinda how it works, really. The de facto standard UI makes it pretty simple, even for someone who's not too tech savvy: https://github.com/AUTOMATIC1111/stable-diffusion-webui
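
For the curious, here's roughly what img2img looks like if you script it with the diffusers library instead of using a UI. A minimal sketch only: it assumes a CUDA GPU, the checkpoint id is just one commonly used example, and the file names are placeholders.

```python
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

# Load a Stable Diffusion checkpoint (this id is one commonly used example).
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")  # assumes a CUDA-capable GPU

# img2img starts from an existing picture, which steers the composition.
init_image = Image.open("input.jpg").convert("RGB").resize((512, 512))

result = pipe(
    prompt="a watercolor painting of a mountain landscape",
    image=init_image,
    strength=0.75,       # 0..1: how far the output may drift from the input
    guidance_scale=7.5,  # how strongly the prompt steers the generation
).images[0]

result.save("output.png")
```

The strength parameter is the whole trick: low values stay close to the input image, high values let the prompt take over, which is why getting a specific result usually takes many iterations.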

M500 ,

Thanks for the link. I've been running some LLMs locally, and I have been interested in Stable Diffusion. I'm not sure I have the specs for it at the moment, though.

TheRealKuni ,

An iPhone from 2018 can run Stable Diffusion. You can probably run it on your computer. It just might not be very fast.

TheRealKuni ,

By the way, if you’re interested in Stable Diffusion and it turns out your computer CAN’T handle it, there are sites that will let you toy around with it for free, like civitai. They host an enormous number of models and many of them work with the site’s built in generation.

Not quite as robust as running it locally, but worth trying out. And much faster than any of the ancient computers I own.

bassomitron ,

Img2img isn't always spot-on with what you want it to do, though. I was making extra pictures for my kid's bedtime books that we made together, and it was really hit or miss. I've even goofed around with my own pictures to turn myself into various characters, and it doesn't work out like you want much of the time. I can imagine it's the same when going for porn, where you'd need to do numerous iterations and tweaks over and over to get the right look/facsimile. There are tools/SD plugins like Roop that make transferring faces with img2img easier and more reliable, but even then it's still not perfect. I haven't messed around with it in several months, so maybe it's better and easier now.

BlackPenguins ,

It depends on the models you use, too. There are specially trained models out there, and all you need to do is give one a prompt like "naked" or something, and it's scary good at making something realistic in 2 minutes. But yeah, there is a learning curve in setting everything up.

Khrux , (edited )

I have no sympathy for the people who are being scammed here, I hope they lose hundreds to it. Making fake porn of somebody else without their consent, particularly that which could be mistaken for real if it were to be seen by others, is awful.

I wish everyone involved in this use of AI a very awful day.

sentient_loom ,
@sentient_loom@sh.itjust.works avatar

Imagine hiring a hit man and then realizing he hired another hit man at half the price. I think the government should compensate them.

brbposting ,
sentient_loom ,
@sentient_loom@sh.itjust.works avatar

Nested hit man scalpers taking advantage of overpaying client.

OKRainbowKid ,

In my experience with SD, getting images that aren't obviously "wrong" in some way takes multiple iterations with quite some time spent tuning prompts and parameters.

echo64 ,

The people being exploited are the ones who are the victims of this, not people who paid for it.

sentient_loom ,
@sentient_loom@sh.itjust.works avatar

There are many victims, including the perpetrators.

sbv ,

It seems like there's a news story every month or two about a kid who kills themselves because videos of them are circulating. Or they're being blackmailed.

I have a really hard time thinking of the people who spend ten bucks making deep fakes of other people as victims.

sentient_loom ,
@sentient_loom@sh.itjust.works avatar

I have a really hard time thinking

Your lack of imagination doesn't make the plight of non-consensual AI-generated porn artists any less tragic.

Vanth , (edited )
@Vanth@reddthat.com avatar

[Thread, post or comment was deleted by the author]

sentient_loom ,
@sentient_loom@sh.itjust.works avatar

Writing /s would have implied that my fellow lemurs don't get jokes, and I give them more credit than that.

Vanth , (edited )
@Vanth@reddthat.com avatar

[Thread, post or comment was deleted by the author]

sentient_loom ,
@sentient_loom@sh.itjust.works avatar

Some people just don't have a sense of humor.

And those people are YOU!!

Thanks for the finger-wagging, you moralistic rapist!

brbposting ,

My sarcasm detector is between 8.5-9.5 outta ten.

Missed it this time, FWIW!

Dkarma ,

No one's a victim, no one's being exploited. Same as taping a head onto a porno mag.

IsThisAnAI ,

The scam is another thing. Fuck these people selling.

But fuck, dude, they aren't taking advantage of anyone buying the service. That's not how the fucking world works. It turns out that if you have money, you can post for people to do shit like clean your house or do an oil change.

NOBODY on that side of the equation is being exploited 🤣

istanbullu ,

It's an "I don't know tech" tax.

sugar_in_your_tea ,

IDK, $10 seems pretty reasonable to run a script for someone who doesn't want to. A lot of people have that type of arrangement for a job...

That said, I would absolutely never do this for someone, I'm not making nudes of a real person.

oce ,
@oce@jlai.lu avatar

That's like 80% of the IT industry.

ColeSloth ,

And mechanics exploit people needing brake jobs. What's your point?
