• Lenguador@kbin.social
    1 year ago

    According to consequentialism:

    1. Imagining sexual fantasies in one’s own mind is fine.
    2. Any action which affects no-one but the actor, such as manifesting those fantasies, is also fine.
    3. Distributing non-consensual pornography publicly is not fine.
    4. Distributing tools for the purpose of non-consensual pornography is a grey area (enables (2), which is permissible, and (3), which is not).

    From this perspective, the only issue one could have with deep fakes is the distribution of pornography which should only be used privately. The author dismisses this take, noting that “few people see his failure to close the tab as the main problem”. I guess I am one of the few.

    Another perspective is to consider the pornography itself to be impermissible. Which, as the author notes, implies that (1) is also impermissible. Most would agree (1) is morally fine (some may consider it disgusting, but that doesn’t make it immoral).

    In the author’s example of Ross teasing Rachel, the author concludes that the imagining is the moral quandary, as opposed to the teasing itself. Drinking water isn’t amoral. Sending a video of drinking water isn’t amoral. But sending that video to someone dying of thirst is.

    The author’s conclusion is also odd:

    Today, it is clear that deepfakes, unlike sexual fantasies, are part of a systemic technological degrading of women that is highly gendered (almost all pornographic deepfakes involve women) […] Fantasies, on the other hand, are not gendered […]

    1. Could you not also equally claim that women are being worshipped instead of degraded? Only by knowing the mind of both the consumer and the model can you determine which is happening. And of course each could have different perspectives.
    2. If there were equal amounts of deep fakes of men as women, the conclusion implies that deep fakes would be fine (as that is the only distinction drawn), which is probably not the author’s intention.
    3. I take issue with the use of systemic. The purpose of deep fakes is the sexual gratification of the user, not degradation. Only if you consider being the object of focus for sexual gratification to be degradation could you claim there is anything systemic here. If it were about degradation, wouldn’t consumers be trying to notify targeted people of their deep fake videos and make them as public as possible?
    4. Singling out “women” as a group is somewhat disingenuous. Women are over-represented in all pornography because the majority of consumers are men and the majority of men are only attracted to women. This is quite clear as ugly women aren’t likely to be targeted. It’s not about “being a woman”, it’s about “being attractive to pornography consumers”. I think to claim “degradation of women” with the caveat that “half of women won’t be affected, and also a bunch of attractive males will be” makes the claim vacuous.
    • If something is amoral then it has no moral implications and is neither good nor bad to do. If something is morally bad to do then it is immoral. It’s a common misuse, and pointing it out may be a bit pedantic because choices with no moral implications are rarely considered or meaningful, but I have to use my degree in applied ethics somehow.

      • Lenguador@kbin.social
        1 year ago

        Haha, thanks for the correction. If you have to use your degree in ethics, perhaps you could add your perspective to the thread?

    • ParsnipWitch@feddit.de
      1 year ago

      I don’t agree that both parties have to agree that something is degrading for it to be considered degrading. When someone considers it degrading that their likeness, voice, way of talking etc. is used to produce porn, I would say it is degrading to them regardless of whether or not that porn is sent around.

      I also think it has an effect on how you treat other people, especially those you use to produce porn. There will probably never be a study about this, though, because I guess it would be seen as unethical to test. Which already speaks for itself…

  • BlameThePeacock@lemmy.ca
    1 year ago

    I think the end result is going to be people simply not caring about their own nudity in images. It will no longer be considered a socially private thing, and therefore it will no longer be an impingement on the person being depicted.

    We’re already starting to see hints of this in some cases: with deepfakes becoming harder and harder to detect, if someone releases naked images of you in a way designed to hurt you, you can simply say it’s a fake, and the social implications you would normally associate with such a release of private material will be significantly reduced or eliminated entirely. This result will become more and more common the longer this technology exists, and once an entire generation grows up with it being part of their culture, it will simply no longer be a problem for them.

  • Can-Utility@beehaw.org
    1 year ago

    Lots to chew over in this piece. My knee-jerk reaction is to be opposed to any efforts to legislate against thoughtcrime, but I’m not insensitive to the effect deepfakes can have on the women targeted. Yet even saying “legislate” in the previous sentence isn’t quite right (nobody’s suggesting consumers of deepfakes should be prosecuted and imprisoned); what the article seems to suggest is a societal shift in approval vs. disapproval of one’s imagination — which is still alarming at a high level, but less so.

    I also wonder if focusing on deepfakes as a unique problem isn’t a category error; AI is making all manner of false scenarios appear photorealistic, with ramifications society-wide. Maybe we need to confront the usage of this technology in general? IDK.

  • CanadaPlus@lemmy.sdf.org
    1 year ago

    I’ve thought a bit about this lately. There are efforts to classify deepfake use as a sex crime, and apparently seeing a deepfake of yourself can be traumatising, but at the same time there’s a whole continuum: deepfakes - fan art - fan fics - impure thoughts. Few people want to ban them all, even if that were practical, which it isn’t, but it’s also hard to tell someone that people should be allowed to make fucked-up porn starring them.

    I’ve yet to hear an actual philosophical argument about where the line should be. It’s nice to see a discussion, at least, rather than just rhetoric.

  • PenguinTD@lemmy.ca
    1 year ago

    That was a very long article trying to “carefully” talk about porn. The thing is, the “NSFW” part is what the guy did wrong: he watched porn on a work computer. All the rest is bullshit, because even before computers existed we had “magazine clip” fakes, aka cutting the carefully chosen head off a celebrity photo and pasting it onto whatever your local Playboy variant was. Now we just have tech that can paste any head onto any video.

    • CanadaPlus@lemmy.sdf.org
      1 year ago

      It does mention that, actually:

      And few people see his failure to close the tab as the main problem.

      I imagine that also applies to the crime of doing work on a personal computer, or vice versa.

      • PenguinTD@lemmy.ca
        1 year ago

        Welp, I admit it was too long; I barely read the whole thing, just kinda skimmed through it.

        But yeah, keeping work and personal PCs separate is essential.

  • Lyrl@lemm.ee
    1 year ago

    It’s not just visual deepfakes, either. There are large language models out there now that can be trained on a person’s writings to chat “in their style”. Having such a model act like an acquaintance, but prompted to express love and/or NSFW fantasies, occupies a similar moral area to visual deepfake porn.

  • Hotchpotch@beehaw.org
    1 year ago

    I’m not surprised that the author of an article in favour of deep fake porn (although disguised as a philosophical “centrist” view) is male.