A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.
This is something I can’t quite get through to my wife. She doesn’t like that I dismiss things to some degree when they don’t make sense. We get into these conversations where I’m like, “I have serious doubts about this,” and she’s like, “Are you saying it didn’t happen?” And I’m like, “No, it may have happened, but not quite in the way they say, or it’s being portrayed in a certain manner.” I’m still going to take video and photos as likely true for now, but I generally want to see it from independent sources, like different folks with their phones along with CCTV of some kind.
Ok so pay the dude $10 to put your wife’s head on someone agreeing with you. Problem solved.
I didn’t expect to get a laugh out of reading this discussion, thanks.
lol. There you go. “Hey, you cheated on me. It’s in this news article right here.”