Y’know, I think the only silver lining to deepfakes is that if my nudes get leaked I could just brush it off and say they’re fake.
Of course, that’s easy to say until you’re slapped in the face with your own bits.
Not enough ad impressions when you give the user what they’re looking for too quickly. ChatGPT will probably go that way eventually when the investor money runs out.
This is a good example of what happens when technology moves faster than our governments can legislate. We don’t live in a world where companies genuinely care about what effect their products will have on people. We rely on the government to care for us. Problem is, even when the government actually takes the needed action, it’s usually too late.
Unfortunately it’s what happens when the policymakers are out of touch with the nuanced realities of industry. Either they make policy based on very narrow definitions, or they get stuck debating issues while the industries evolve beyond the issue being debated.
I can only imagine how fucked gender relations would have to already be for this to be a normal thing to consider doing publicly.
You know what’s kind of odd? These deepfake tools people are using are usually online ones. As far as I know, there aren’t any public models being used for deepfakes. I’m sure you could make them with the right hardware, but that’s a lot of heavy lifting.
There are plenty of generative AI models for creating porn. To me there’s a red flag people are ignoring by using the deepfake apps/websites: everything uploaded gets tied to the uploader, whether the site says it’s private or not.
It’s pretty fucked up. But maybe the solution is to make nudity more acceptable in society. It’s not a coincidence that South Korea is such a big target for it, because sex and nudity are so taboo there. The US is not that much better, but enough that it doesn’t feel as invasive to as much of the population, and thus it doesn’t have as much value to people using it to shame and control women. But of course that takes generations to change and would require a lot of women to do unsafe things to effect that change. So it’s not something that will happen quickly.
I think it’s still fucked up even in a less taboo world.
Like, instead of receiving an unsolicited dick pic, receiving an unsolicited deepfaked version of yourself getting dicked is all kinds of fucked up on top of just the nudity.
Fight back with porn. Show men with small wieners sobbing uncontrollably, you know, showing non-masculine emotions and traits. Let them deal with their own body problems.
honestly with the prevalence of massive unrealistic dongs i wouldn’t mind seeing smaller ones in generated porn.
i am horrified at all of this, i’m just trying to find the silver lining.
I’m not sure I understand how deepfake porn is supposed to be ruining lives here. From the article it seems like the issue is not any concern that it would be mistaken for real, but instead just people having a very horrified reaction to seeing that sort of depiction of themselves? Mostly it seems like the deepfake aspect is sort of a trivial distraction from the real issue on display there: gangs of men targeting random women for online harassment. If there was no such thing as deepfakes, other sexually explicit or disturbing images could sub in easily enough.
Maybe the new deepfake ban will be useful as a way of going after these harassment gangs that previously didn’t face legal consequences? But it’s sort of an inexact tool for that job, given that there are presumably lots of deepfake images out there not used for harassment, and that it’s easy enough for harassers to switch away from deepfakes if using them becomes a major legal vulnerability.
Associated Press - News Source Context:
MBFC: Left-Center - Credibility: High - Factual Reporting: High - United States of America
https://apnews.com/article/south-korea-deepfake-porn-women-df98e1a6793a245ac14afe8ec2366101