

Glass0448

@Glass0448@lemmy.today

Philip answered him, 2 books is not sufficient for them. And Jesus took the books; and when he had given thanks, he distributed to the disciples, and the disciples to them that were set down. Therefore they gathered them together, and filled twelve baskets with the new copies, which remained over.


Glass0448, to Technology in Microsoft has blocked the bypass that allowed you to create a local account during Windows 11 setup by typing in a blocked email address

Stuck on decrappified Windows for the immediate future.

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

Everybody is American. They just don't know it yet.

Gospel of the Jesus

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

It seems to me to be a lesser charge: a net that catches a larger population, which they can then use to go fishing for bigger fish and make the prosecutor look good. Or, as I've heard from others, it is used to simplify prosecution. PedoAnon can't argue "it's a deepfake, not a real kid" to the SWAT team.

There is a massive disconnect between what we should be seeing and what we are seeing. I assume that's because the authorities who moderate this shit almost exclusively go after real CSAM, on account of it actually being a literal offense, as opposed to drawn CSAM, which is a proxy offense.
This can be attributed to a lack of proper funding for CSAM enforcement. Pedos get picked up if they become an active embarrassment, like the article dude. Otherwise all the money is just spent on the database getting bigger and keeping the lights on. Which works for Congress: a public pedo gets nailed to the wall because of the database, the spooky spectre of the pedo out for your kids remains, vote for me please...

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

And also it's an AI.

13k images before AI involved a human with Photoshop or a child doing fucked up shit.

13k images after AI is just forgetting to turn off the CSAM auto-generate button.

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

The developers of Stable Diffusion have been distancing themselves from this. The model that allows for this was leaked from a different company.

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

Making the CSAM is illegal by itself: https://www.thefederalcriminalattorneys.com/possession-of-lolicon

Title is pretty accurate.

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

"so many people still think it should be illegal"

It is illegal. https://www.thefederalcriminalattorneys.com/possession-of-lolicon

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

It would be illegal in the United States. Artistic depictions of CSAM are illegal under the PROTECT Act of 2003.

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

Asked whether more funding will be provided for the anti-paint enforcement divisions: it's such a big backlog, we'd rather just wait for somebody to piss off a politician to focus our resources.

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

"Simulated crimes aren’t crimes."

Artistic CSAM is definitely a crime in the United States. PROTECT Act of 2003.

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

"The major concern to me is that there isn’t really any guidance from the FBI on what you can and can’t do, which may lead to some big issues."

The PROTECT Act of 2003 means that any artistic depiction of CSAM is illegal. The guidance is pretty clear: the FBI is gonna raid your house... eventually. We still haven't properly funded the anti-CSAM departments.

Glass0448, to Technology in FBI Arrests Man For Generating AI Child Sexual Abuse Imagery

OMG. Every other post is saying they're disgusted about the images part, but it's a grey area, but he's definitely in trouble for contacting a minor.

Cartoon CSAM is illegal in the United States. AI images of CSAM fall into that category. It was illegal for him to make the images in the first place BEFORE he started sending them to a minor.

https://www.thefederalcriminalattorneys.com/possession-of-lolicon

https://en.wikipedia.org/wiki/PROTECT_Act_of_2003
